- ProPublica was recently able to buy rental ads on Facebook that illegally excluded African-Americans, Spanish speakers and other groups.
- The problem, first discovered in October 2016, was supposed to have been fixed in February, when the company adopted policies to screen housing ads more thoroughly.
ProPublica reported in October 2016 that it was able to buy an ad in Facebook's housing category and exclude black, Hispanic and Asian-American users from seeing it. Housing discrimination on the basis of race, color, religion, sex, handicap, familial status or national origin is illegal under the Fair Housing Act. Facebook said it added new tools in February to prevent the issue.
But a new ProPublica report shows the publication recently bought housing ads -- and was still allowed to block groups including African Americans, people interested in wheelchair ramps, Jews, and Spanish speakers from seeing them. Most ads were approved within three minutes; one excluding people "interested in Islam, Sunni Islam and Shia Islam" took longer but was also eventually approved.
While the ad bought in October 2016 promoted a community meeting about housing, the recent ads were for rental apartments, making the exclusions illegal.
Facebook admitted it allowed the ads and apologized for the error in a statement. Ami Vora, VP of product management, said the company will add a notification across all categories for anyone who places an ad that excludes certain groups of people. Purchasers will have to acknowledge they are posting in accordance with Facebook's policies and the law.
"This was a failure in our enforcement and we're disappointed that we fell short of our commitments," Vora said in the statement.
"Earlier this year, we added additional safeguards to protect against the abuse of our multicultural affinity tools to facilitate discrimination in housing, credit and employment. The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure. Our safeguards, including additional human reviewers and machine learning systems have successfully flagged millions of ads and their effectiveness has improved over time. Tens of thousands of advertisers have confirmed compliance with our tighter restrictions, including that they follow all applicable laws. We don't want Facebook to be used for discrimination and will continue to strengthen our policies, hire more ad reviewers, and refine machine learning tools to help detect violations. Our systems continue to improve but we can do better. While we currently require compliance notifications of advertisers that seek to place ads for housing, employment, and credit opportunities, we will extend this requirement to ALL advertisers who choose to exclude some users from seeing their ads on Facebook to also confirm their compliance with our anti-discrimination policies – and the law."
A source close to the situation said a coding error meant rental ads were not screened for discriminatory targeting; housing ads that pertained to purchasing property, however, were properly screened.
Facebook also faced scrutiny over its ad targeting capabilities in September, when another ProPublica report showed the platform allowed ads to be targeted at people who identified as "Jew haters" and with other anti-Semitic terms. The company responded by adding more features to block the use of derogatory terms to target users, as well as hiring more people to review ads before they go live.