Facebook Removes Advertising Attributes, Says Move Is Unrelated to Discrimination Accusations
Facebook has claimed that its elimination of 5,000 options that enable advertisers to target their ads to certain audiences is unrelated to accusations that the social media site is fostering housing and employment discrimination. Yet the elimination of what have been described as “sensitive personal attributes” came just four days after the Department of Justice (DOJ) joined a lawsuit filed by fair housing groups in federal court in New York City over the matter. The lawsuit alleges that advertisers could use Facebook’s audience options to prevent racial and religious minorities and other protected groups from viewing ads.
“We’ve been building these tools for a long time and collecting input from different outside groups,” Facebook spokesperson Joe Osborne said, adding the decision is unrelated to the allegations.
The Communications Decency Act of 1996 gives internet companies immunity from liability for content on their platforms. However, the DOJ stated this immunity does not apply to Facebook’s advertising portal, even though the company has repeatedly cited the act in legal proceedings.
The Department of Housing and Urban Development filed a formal complaint against the social media giant around the same time the DOJ joined the litigation, and Facebook responded by saying its policies strictly prohibit discrimination, it has strengthened its systems over the past year, and that it will work with HUD to address any issues.
“The Fair Housing Act prohibits housing discrimination including those who might limit or deny housing options with a click of a mouse,” said Anna María Farías, HUD’s assistant secretary for fair housing and equal opportunity. “When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”
In July, the state of Washington imposed legally binding compliance standards on Facebook, barring it from offering advertisers the option of excluding protected groups in housing, credit, employment, or insurance ads, or in ads geared toward “public accommodations of any kind.”
In August, Facebook said it discovered evidence of Russian and Iranian efforts to influence elections in the U.S. and around the world through fake accounts and targeted advertising. It also said it had suspended more than 400 apps “due to concerns around the developers who built them or how the information people chose to share with the app may have been used.”
Facebook has indicated that the removal of the options, which it maintains is unrelated to the most recent claims, will be completed next month. According to the company, these categories have not been widely used by advertisers to discriminate, and their removal is intended to be proactive. In some cases, advertisers legitimately use these categories to reach key audiences. Facebook is not limiting advertisers’ options for narrowing audiences by sex or age.
A pending suit in federal court in San Francisco alleges that, by allowing employers to target audiences by age, Facebook is enabling employment discrimination against older candidates. Peter Romer-Friedman, an attorney representing the plaintiffs, said that Facebook’s removal of the 5,000 options “is a modest step in the right direction.” He added that allowing employers to restrict age, however, “shows what Facebook cares about: its bottom line. There is real money in age-restricted discrimination.”
Last November, Facebook added a self-certification option, which asks housing advertisers to check a box agreeing that their advertisement is non-discriminatory. The social media site also plans to require advertisers to read educational material on ethical advertising practices.