Google has fixed a problem with its Photos app, which identified black people as gorillas, by removing the gorilla label from the rogue algorithm altogether.
It isn’t the first time Google Photos has been labelled racist: back in 2015, a serious tagging failure was shared on Twitter when, again, black people were labelled as gorillas.
The machine-learning algorithm recognises images by establishing defining features and matching them against its extensive database, but the racist blunder is a huge setback for Google Photos.
At the time, a Google spokesperson said: “We’re appalled and genuinely sorry that this happened.”
“There is still clearly a lot of work to do with automatic image labelling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
And it's only photos I have with her it's doing this with (results truncated b/c personal): pic.twitter.com/h7MTXd3wgo
— Jacky Alciné (@jackyalcine) 29 June 2015
The debacle also saw then-Google engineer Yonatan Zunger scrambling to reassure Jacky Alciné over Twitter that there would be a fix.
“Sheesh. High on my list of bugs you *never* want to see happen.”
“Searches are mostly fixed,” he went on.
“We’re also working on longer-term fixes around both linguistics (words to be careful about in photos of people).”
The heavy-handed approach of removing gorilla tagging altogether means that while primates such as baboons, gibbons, and marmosets were accurately identified and tagged, it emerged that gorillas and chimpanzees weren’t being recognised at all.
The site also revealed that searching for “black woman” or “black man” returned results matched by clothing colour rather than identifying people by race.