Facebook apologizes after its A.I. labeled a video of black men ‘primates’



September 3, 2021

Facebook users who recently viewed a video from a British newspaper featuring black men saw an automated prompt from the social network asking if they would like to “continue watching videos about primates,” prompting the company to investigate and disable the A.I.-powered recommendation feature.

On Friday, Facebook apologized for what it called an “unacceptable error” and said it was looking into the recommendation feature “to prevent this from happening again.”

The video, dated June 27, 2020, was published by the Daily Mail and showed clips of black men in scuffles with white civilians and police officers. It had nothing to do with monkeys or primates.

Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She then posted it in a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”

Ms Groves said the prompt was “terrifying and appalling”.

“As we’ve said, while we’ve made improvements to our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations,” Facebook spokeswoman Dani Lever said in a statement.

Google, Amazon, and other tech companies have been under scrutiny for years for bias within their AI systems, particularly on issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble recognizing them, leading to incidents in which black people have been discriminated against or arrested because of computer error.

In one example from 2015, Google Photos mistakenly labeled photos of black people as “gorillas,” for which Google said it was “really sorry” and would fix the issue immediately. More than two years later, Wired found that Google’s solution had been to censor the word “gorilla” from searches, while also blocking “chimp”, “chimpanzee” and “monkey”.

Facebook has one of the largest repositories of user-uploaded images in the world on which to train its facial- and object-recognition algorithms. The company, which tailors content to users based on their past browsing and viewing habits, sometimes asks if they would like to continue seeing posts under related categories. It was not clear whether the “primates” prompt was widespread.

Facebook and its photo-sharing app, Instagram, have had other issues related to race. After the European Football Championship in July, for example, three black members of England’s national football team were subjected to racist abuse on the social network after missing penalty kicks in the shootout that decided the championship match.

Racial issues have also caused internal conflict at Facebook. In 2016, chief executive Mark Zuckerberg asked employees to stop crossing out the phrase “black lives matter” and replacing it with “all lives matter” in a communal space at the company’s headquarters in Menlo Park, California. Hundreds of employees also staged a virtual walkout last year to protest the company’s handling of a post by President Donald J. Trump about the killing of George Floyd in Minneapolis.

The company later hired a vice president for civil rights and released a civil rights audit. In its annual diversity report in July, Facebook said 4.4 percent of its US-based employees were black, up from 3.9 percent a year earlier.

Ms Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems had not been a priority for its leaders.

“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,’” she said.

