Facebook said Friday it had disabled its topic recommender feature after it mistook black men for “primates” in a video on the social network. A Facebook spokesperson called it a “clearly unacceptable error” and said the recommendation software had been taken offline.
“We apologize to anyone who saw these offensive recommendations,” Facebook said in response to an AFP investigation.
“We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the root cause and prevent it from happening again.”
Facial recognition software has been criticized by civil rights advocates for accuracy problems, especially when it comes to people who are not white.
Facebook users who watched a British tabloid video featuring black men in recent days were presented with an auto-generated prompt asking if they wanted to “keep seeing videos about Primates,” according to NewsMadura.
uh. This “keep seeing” prompt is unacceptable, @Facebook. And despite the video being over a year old, a friend got this prompt yesterday. Friends on FB, please escalate. This is huge. pic.twitter.com/vEHdnvF8ui
— Darci Groves (@tweetsbydarci) September 2, 2021
The video in question, posted by the Daily Mail in June 2020, is titled “White man calls police on black men in marina.”
Although humans are among the many species in the primate order, the video had nothing to do with monkeys, chimpanzees, or gorillas.
A screenshot of the recommendation was shared on Twitter by former Facebook content design manager Darci Groves.
“This ‘keep seeing’ prompt is unacceptable,” tweeted Groves, who addressed the message to former Facebook colleagues.