
Facebook has been promising to fix discrimination problems with its ads platform ever since a ProPublica report last fall found that its targeting tools had the potential to violate civil rights laws.

A year later, those efforts don't seem to have amounted to much.


A follow-up investigation from ProPublica on Tuesday found that Facebook still approved housing ads that excluded users based on their religion, gender, or "multicultural affinity" (Facebook's thinly veiled stand-in for race, previously called "ethnic affinity"). Each of those demographics is considered a protected group under the Fair Housing Act.

After ProPublica's report first broke last year, the social network attempted to ease the controversy with some bare-minimum changes to its platform and an internal probe a week later. The following February, it rolled out more tools meant to curb discrimination on the platform, including a machine learning system designed to detect when combinations of targeting categories might be discriminatory, as well as warnings to businesses placing housing and employment ads.

Even so, the news site reported that each of the recent ads it took out to test Facebook's systems was approved within a few minutes. The only exception was an ad centered on users interested in Islam, which took close to half an hour.


Facebook blamed the problem on "a technical failure."

“This was a failure in our enforcement and we’re disappointed that we fell short of our commitments," Facebook's VP of product management, Ami Vora, said in a statement. "We don’t want Facebook to be used for discrimination and will continue to strengthen our policies, hire more ad reviewers, and refine machine learning tools to help detect violations. Our systems continue to improve but we can do better."

The company also announced that it would begin expanding its warning system to all ads that are set up to exclude certain groups. The system previously applied only to housing, job, and credit ads.

Facebook also claims that its new software and additional human reviewers have managed to flag millions of discriminatory ads since they were deployed earlier this year.

The report follows another investigation by CNBC last week that found that Facebook was rife with posts advertising the illegal sale of prescription opioids and other dangerous drugs.

Reports like these, as well as the ongoing congressional investigation into the role Facebook played in a Kremlin-aligned campaign to influence the U.S. presidential election, have poked many holes in Facebook's claims that it strictly polices its advertising platform for nefarious actors.

This post has been updated to include a statement from Facebook.

