Over the past few years, Google’s algorithm has been called out by users and software engineers who find certain elements of the search engine tinged with racism.
You may recall back in 2015 how the image recognition algorithms in Google Photos were classifying Black people as “gorillas.” Google said it was “appalled” at the mistake and promised to fix the problem. But, as a follow-up report from Wired shows, nearly three years on, Google hasn’t really fixed anything.
There was also a time when, if you did a Google image search for “Black Women” or “Black Men,” the results would be littered with blonde-haired, white faces.
Following the backlash, Google claimed it “fixed” these issues by restricting its AI recognition in various racial categories. Now, searching for “black man” or “black woman” will often return pictures of people in black and white, sorted by gender but not race, per The Verge.
And it’s not just Black folks and gorillas that Google finds challenging to differentiate. Black celebrities can’t catch a break either… just ask Chaka Khan.
Last year, the iconic singer slammed Google for mistaking her for a man; peep the image above.
When Chaka caught word of this nonsense, she called out Google on Instagram, writing: “Dear @Google, WTF?!?! AND you putting my business all out in these streets??? Nick is giving you side eye from Heaven!”
Fans found the caucacity of Google hilarious, with one noting: “I guess they TRULY believe all black people look alike lol”.
Another observed: “After all this time you’d think they would fix that but they haven’t as of 4/2/2018.”
Peep a few more fan reactions below.
And peep Chaka’s response:
Article Courtesy of EURweb
First Picture Courtesy of Randy Holmes and Getty Images
Second Picture Courtesy of Steve Mack and Getty Images
Gif, Third Picture, and First through Fifth Tweet Courtesy of Twitter and EURweb
Fourth Picture Courtesy of Instagram and EURweb