A Google Image Search Result for Two Black Men Further Proves Racial Bias in Algorithms

Impact

Google has been accused of racial bias in its search algorithm in the past. Searches for "black teenagers" yielded photos of mugshots, "beautiful dreadlocks" displayed mostly white people and "unprofessional hairstyles for work" showed mostly black women with natural hair. And now, Google's best guess for a photo of two black men is "gang." 

BuzzFeed's David Mack pointed out that when he dropped an image of two black men into Google Image search, the site's best guess for the photo was "gang."

The photo was from Share the Safety, a website about a gun-sharing program for "at-risk neighborhoods" that some believe to be a hoax. 

When I ran the same image through Google Image search, the best guess was also "gang."


The original photo was taken by photographer Victor Torres and features a black man and a white man. When I put that nearly identical photo, with the same graffiti background, into Google Image search, the best guess was "art."


Maybe it's the graffiti that triggered the result? When I put an image of two white women in front of a graffiti backdrop into Google Image search, the best guess was "best friends aesthetic." 


Algorithms are not free from human bias: People create and maintain them, and algorithms and online search results can "reflect people's attitudes and behaviors," the New York Times reported. 

People's effect on algorithms was evident when Gizmodo reported that Facebook's trending news section is shaped by human input. And the way algorithms mirror people's online behavior was evident when Microsoft pulled the plug on its chatbot Tay after users taught it to spout racist, Nazi rhetoric.

A recent investigation by ProPublica also revealed that risk-assessment algorithms used by the criminal justice system are easier on white defendants, further proof that the data these systems learn from can carry and amplify human prejudice.

"From a machine learning perspective, if you don't think about gender inclusiveness, then oftentimes the inferences that get made are biased towards the majority group—in this case, affluent white males," Margaret Burnett, a professor at Oregon State University's School of Electrical Engineering and Computer Science, told Bloomberg. "If un-diverse stuff goes in, then closed-minded, inside-the-box, not-very-good results come out."
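Burnett's point can be illustrated with a deliberately simplified sketch (the data and labels here are hypothetical, not Google's actual system): a model trained on lopsided data can end up applying the majority label to everything it sees, regardless of the input's actual features.

```python
from collections import Counter

# Hypothetical toy training set: 90% of examples carry one label,
# 10% carry another. Real systems are subtler, but skew has the same pull.
training_labels = ["group_a"] * 90 + ["group_b"] * 10

def majority_class_predictor(labels):
    """Return a 'model' that always predicts the most common training label."""
    majority_label, _ = Counter(labels).most_common(1)[0]
    # The returned function ignores its input entirely.
    return lambda _features: majority_label

predict = majority_class_predictor(training_labels)

# Every input, no matter what its features are, gets the majority label.
print(predict({"feature": "anything"}))  # prints "group_a"
```

The sketch is an extreme case, but it shows the mechanism: when the training data is not diverse, the "inferences that get made" default toward whatever dominated that data.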

To prevent racial and gender bias from creeping into our software, perhaps the humans creating it need to be more diverse than predominantly white, male Silicon Valley.

Mic reached out to Google for comment and will update with a response.
