
Facebook Lets Advertisers Exclude Users by Race

[Editor's note: Today's guest post was originally published by ProPublica on October 28, 2016. It is reprinted with permission.]

by Julia Angwin and Terry Parris Jr., ProPublica

Imagine if, during the Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers.

That's basically what Facebook is doing nowadays.

The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls "Ethnic Affinities." Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.

Here is a screenshot of a housing ad that we purchased from Facebook's self-service advertising portal:

[Screenshot: the housing ad purchase in Facebook's self-service advertising portal, showing the ethnic-affinity exclusion options.]

The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an "affinity" for African-American, Asian-American or Hispanic people. (Here's the ad itself.)

When we showed Facebook's racial exclusion options to prominent civil rights lawyer John Relman, he gasped and said, "This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find."

The Fair Housing Act of 1968 makes it illegal "to make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin." Violators can face tens of thousands of dollars in fines.

The Civil Rights Act of 1964 also prohibits the "printing or publication of notices or advertisements indicating prohibited preference, limitation, specification or discrimination" in employment recruitment.

Facebook's business model is based on allowing advertisers to target specific groups, or, apparently, to exclude specific groups, using huge reams of personal data the company has collected about its users. Facebook's microtargeting is particularly helpful for advertisers looking to reach niche audiences, such as swing-state voters concerned about climate change. ProPublica recently offered a tool allowing users to see how Facebook is categorizing them. We found nearly 50,000 unique categories in which Facebook places its users.

Facebook says its policies prohibit advertisers from using the targeting options for discrimination, harassment, disparagement or predatory advertising practices.

"We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law," said Steve Satterfield, privacy and public policy manager at Facebook. "We take prompt enforcement action when we determine that ads violate our policies."

Satterfield said it's important for advertisers to have the ability to both include and exclude groups as they test how their marketing performs. For instance, he said, an advertiser "might run one campaign in English that excludes the Hispanic affinity group to see how well the campaign performs against running that ad campaign in Spanish. This is a common practice in the industry."

He said Facebook began offering the "Ethnic Affinity" categories within the past two years as part of a "multicultural advertising" effort.

Satterfield added that the "Ethnic Affinity" is not the same as race, which Facebook does not ask its members about. Facebook assigns members an "Ethnic Affinity" based on pages and posts they have liked or engaged with on Facebook.

When we asked why "Ethnic Affinity" was included in the "Demographics" category of its ad-targeting tool if it's not a representation of demographics, Facebook responded that it plans to move "Ethnic Affinity" to another section.

Facebook declined to answer questions about why our housing ad excluding minority groups was approved 15 minutes after we placed the order.

By comparison, consider the advertising controls that the New York Times has put in place to prevent discriminatory housing ads. After the newspaper was successfully sued under the Fair Housing Act in 1989, it agreed to review ads for potentially discriminatory content before accepting them for publication.

Steph Jespersen, the Times' director of advertising acceptability, said that the company's staff runs automated programs to make sure that ads that contain discriminatory phrases such as "whites only" and "no kids" are rejected.

The Times' automated program also highlights ads that contain potentially discriminatory code words such as "near churches" or "close to a country club." Humans then review those ads before they can be approved.
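The Times has not published the mechanics of this program, but the two-tier screen it describes (automatic rejection of explicitly discriminatory phrases, human review for coded language) can be sketched roughly as follows. The phrase lists are drawn only from the examples quoted above; the function name and structure are illustrative assumptions, not the Times' actual system.

    # Rough sketch of a two-tier ad-screening filter, loosely modeled on the
    # process described above. Phrase lists and names are hypothetical examples.
    REJECT_PHRASES = {"whites only", "no kids"}                    # auto-reject
    REVIEW_PHRASES = {"near churches", "close to a country club"}  # flag for review

    def screen_ad(ad_text: str) -> str:
        """Return 'reject', 'needs_human_review', or 'accept' for a housing ad."""
        text = ad_text.lower()
        if any(phrase in text for phrase in REJECT_PHRASES):
            return "reject"              # explicitly discriminatory language
        if any(phrase in text for phrase in REVIEW_PHRASES):
            return "needs_human_review"  # coded language: escalate to a person
        return "accept"

    print(screen_ad("Cozy two-bedroom, close to a country club"))  # needs_human_review

A real screen would need far richer phrase lists and context-aware judgment, but the basic division of labor (software catches the obvious cases, humans judge the ambiguous ones) is what Jespersen describes.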

Jespersen said the Times also rejects housing ads that contain photographs of too many white people. The people in the ads must represent the diversity of New York's population, and if they don't, he said, he will call up the advertiser and ask them to submit an ad with a more diverse lineup of models.

But, Jespersen said, these days most advertisers know not to submit discriminatory ads: "I haven't seen an ad with 'whites only' for a long time."

ProPublica is a Pulitzer Prize-winning investigative newsroom. Sign up for their newsletter.

Comments


Chanson de Roland

Just when I thought that my reasons for detesting Facebook had been exhausted, I learn this. So in addition to violating our privacy by collecting our personal information and profiling us and then selling all of us, that is, our personal information and our profiles, to advertisers, marketers, political campaigns, etc., as if we were a bunch of field slaves at auction; engaging in censorship of views and news stories in ways that have been shown to discriminate against conservatives, censor stories critical of Facebook, and otherwise promote the views of its founder, Mark Zuckerberg, and/or his senior managers; and conducting tests on its users without their knowledge, much less their permission, by controlling the content they are exposed to, we now have Facebook as abettor and facilitator of racist advertising. To borrow the phrase of one of the biggest beneficiaries of its political contributions, Hillary Clinton, Facebook should be in the basket of deplorables.

Now that it has been called out on this, I suppose Facebook will have to do something other than be guided, as it almost always is, by the principle of doing whatever makes the most profit as long as it can get away with it, and, when it doesn't get away with it, of asking for forgiveness rather than permission.

These social media companies are becoming way too powerful in our lives and in our politics. Thus, it is time to apply venerable legal principles to curb their power and wealth. First, it should be acknowledged that the personal information we author online is our property, in which we hold a copyright. Second, where companies have too much market share not because of a superior product, and thus winning the battle of competition, but merely because of what are known as network effects, antitrust law should be applied to break those companies up into independent competitive units with smaller market shares that are interoperable, so that users of one social media company can fully interact with those of another. Third, in addition to acknowledging our ownership of our personal information, as set forth supra, Congress should empower the FTC to protect our privacy from intrusion without consent and from unfair commercial agreements. Fourth, parents should be allowed to have access to and control of the Facebook accounts of their minor children. Finally, Facebook should not be allowed to discriminate in its services against content based on its ideas, political philosophy, or expression, provided that the content and expression are decent, civil, lawful, and reasonable.

These are some of the reforms needed to curb Facebook's pernicious social effects and violations of personal rights; they should also be applied to any social media company that has significant market share in any relevant market.

George

Everyone:

Thanks to Roland for the comment. Yes, this is very troubling. There are valid applications for filtering on race, such as advertisers selling food, hair care products, and the like. It seems that too many Americans either don't know or have forgotten that there are both federal and state laws prohibiting discrimination in housing, the very discrimination these FBook filters allegedly facilitate. Some people mistakenly believe that these housing filters are okay given recent trends in "big data" and data mining.

While some readers know this history, I think it is important to remind all readers that the United States has a long, sordid history of people being discriminated against when trying to buy, sell, rent, or finance a place to live. I hope that today's blog post (and future posts) will encourage readers to familiarize themselves with anti-discrimination laws, so they can recognize illegal discrimination when they see it and know their rights.

Some sources for people unfamiliar with anti-discrimination laws in housing:

What Kind of Housing Discrimination Is Illegal?
http://www.nolo.com/legal-encyclopedia/free-books/renters-rights-book/chapter5-2.html

And, the Federal law:

Fair Housing Act
https://www.justice.gov/crt/fair-housing-act-2

In its race to make money and maximize profits, it seems that Facebook did not think through the instances, applications, or use models where the filters could be applied illegally. Or maybe Facebook's attorneys judged that the advertiser (and not Facebook) would be liable in court for using the filters illegally. Either way, it's ethically dubious.

George
Editor
http://ivebeenmugged.typepad.com
