
Fair housing groups sue Facebook for allowing discrimination in housing ads

3rd April 2018

By Julia Angwin and Ariana Tobin
ProPublica

In February 2017, in response to a ProPublica investigation, Facebook pledged to crack down on efforts by advertisers of rental housing to discriminate against tenants based on race, disability, gender and other characteristics.

But a new lawsuit, filed Tuesday by the National Fair Housing Alliance in U.S. District Court in the Southern District of New York, alleges that the world’s largest social network still allows advertisers to discriminate against legally protected groups, including mothers, the disabled and Spanish-language speakers.

Since 2018 marks the 50th anniversary of the Fair Housing Act, “it is all the more egregious and shocking” that “Facebook continues to enable landlords and real estate brokers to bar families with children, women and others from receiving rental and sales ads or housing,” the lawsuit states. It asks the court, among other things, to declare that Facebook’s policies violate fair housing laws, to bar the company from publishing discriminatory ads, and to require it to develop and make public a written fair housing policy for advertising.

Diane Houk, lead counsel for the alliance, said this type of discrimination is especially difficult to uncover and combat. “The person who is being discriminated against has no way to know” it, because the technology “keeps the discrimination hidden in hopes that it will not be caught,” she said.

Facebook disputes the housing groups’ allegations. “There is absolutely no place for discrimination on Facebook. We believe this lawsuit is without merit, and we will defend ourselves vigorously,” said Facebook spokesman Joe Osborne.

The lawsuit adds to Facebook’s woes, which are mounting on multiple fronts. The company’s stock plunged last week on the news that it had allowed a voter-profiling outfit, Cambridge Analytica, to obtain data on 50 million of its users without their knowledge or consent. The news came after a troubling year in which, among other things, Facebook admitted that it unwittingly allowed a Russian disinformation operation on its platform and had been promoting fake news through its News Feed algorithm. As a result, lawmakers and regulators around the world have launched investigations into Facebook.

Discrimination in housing advertising has been a persistent problem for Facebook. In October 2016, we described how Facebook let advertisers exclude specific groups, categorized by what it called “ethnic affinities,” including blacks and Hispanics, from seeing ads. Although Facebook responded by announcing it had built a system to flag and reject these ads, we bought dozens of rental housing ads in November 2017 that we specified would not be shown to blacks, Jews, people interested in wheelchair ramps and other groups.

It wasn’t until ProPublica brought the issue of advertising discrimination on Facebook to light, Houk said, that fair housing advocates learned of it. Emulating ProPublica’s technique, the Washington, D.C.-based national fair housing group, along with member groups in New York, San Antonio and Miami, created fake housing companies and placed discriminatory ads on Facebook. The ads were approved by Facebook over a period of a few months, with the most recent buys occurring on Feb. 23.

Using Facebook’s dropdown “exclusion” menu, they were able to buy housing ads that blocked groups such as “trendy moms,” “soccer moms,” “parents with teenagers,” people interested in a disabled parking permit and people interested in Telemundo, the Spanish-language television network.

The Fair Housing Act makes it illegal to publish any advertisement “with respect to the sale or rental of a dwelling that indicates any preference, limitation or discrimination based on race, color, religion, sex, handicap, familial status or national origin.” Violators may face tens of thousands of dollars in fines.

After ProPublica’s investigation, Facebook added a self-certification option, which asks housing advertisers to certify that their advertisement is not discriminatory. In some cases, Houk said, the housing groups encountered the self-certification option and chose not to submit those ads to Facebook for approval and publication. But the screen appeared in only some of the ad buys, she said.

Since advertisers can falsely attest to fairness, the self-certification screens don’t “seem like a whole-hearted commitment to trying to change the advertising platform to comply with the Fair Housing Act and local fair housing laws,” Houk said.

A couple of weeks after the groups bought housing ads, so did ProPublica (independently) — and we excluded some of the same categories, such as “soccer moms.” In most of those tests, we encountered self-certification screens. However, when we bought another housing ad last week, we were able to exclude people interested in Telemundo.

Houk said there were so many possible explanations for the difference in results — such as the number of categories excluded or the types of exclusions sought — that it was impossible to speculate about what caused many of her clients’ ad purchases to be approved but not ProPublica’s.

Both the fair housing groups and ProPublica found that Facebook has blocked the use of race as an exclusion category — as it promised to do in November. Facebook rejected a ProPublica housing ad that was specifically aimed at African Americans. It also denied our attempts to buy employment ads targeted by race, and removed a job listing with a question designed to filter by race. However, the housing groups’ and ProPublica’s ability to exclude people interested in Telemundo suggests that advertisers could still discriminate by using proxies for race or ethnicity.

In a separate federal case in California challenging Facebook’s use of racial exclusions in ad targeting, Facebook has argued that it is immune from liability for such discrimination. It cited Section 230 of the 1996 federal Communications Decency Act, which protects internet companies from liability for third-party content.

“Advertisers, not Facebook, are responsible for both the content of their ads and what targeting criteria to use, if any,” Facebook contended.

Madeleine Varner contributed to this report.

This article was originally published in the April 2, 2018 print edition of The Louisiana Weekly newspaper.

