Amazon’s recruiting software that ranked men higher than women is a lesson in AI bias
Daniel Thorpe, of Hoboken, New Jersey, uses the Amazon app to pay for his purchase at the Amazon 4-star store in the Soho neighborhood of New York. Mary Altaffer/AP

Amazon created an artificial intelligence tool to automate its ratings of job applicants. The tool was scrapped because it routinely gave lower scores to applications from women, according to the Daily Beast.

Amazon’s AI technology was trained to rank applicants, on a one-to-five-star scale, based on patterns found in their resumes. To train the software, Amazon used 10 years’ worth of resume data. Unfortunately, most of those resumes came from male applicants.

The Daily Beast noted that the AI penalized resumes containing the word “women’s.” Resumes that listed certain women’s colleges were also docked points.
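To see how a scoring model can end up penalizing a word like “women’s” without anyone programming it to, consider a toy sketch. The data below is entirely invented; the point is only the mechanism: if past hiring outcomes skewed male, tokens that appear mostly on resumes that were rejected pick up negative weight, regardless of whether they say anything about ability.

```python
from collections import Counter
import math

# Invented historical data: resumes reduced to token lists, labeled by
# past hiring outcome. The skew (no hired resume mentions "women's")
# mirrors a male-dominated training set.
hired = [
    ["python", "java", "chess", "captain"],
    ["java", "sql", "football"],
    ["python", "sql", "chess"],
]
rejected = [
    ["python", "women's", "chess", "captain"],
    ["sql", "women's", "volleyball"],
]

def token_weights(pos, neg):
    """Smoothed log-odds of each token appearing in hired vs rejected resumes."""
    p = Counter(t for resume in pos for t in resume)
    n = Counter(t for resume in neg for t in resume)
    vocab = set(p) | set(n)
    return {t: math.log((p[t] + 1) / (n[t] + 1)) for t in vocab}

weights = token_weights(hired, rejected)

# "women's" never appears on a hired resume, so the model learns to
# score it negatively; "python" appears mostly on hired ones.
print(weights["women's"] < 0)  # True
print(weights["python"] > 0)   # True
```

Nothing here is Amazon’s actual model; it only illustrates why a system trained on biased outcomes reproduces them.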

In a statement sent to Mic by an Amazon spokesperson, the company said, “This was never used by Amazon recruiters to evaluate candidates.”

Software creators can inadvertently build their own biases into artificial intelligence technology. Surveys suggest most people believe such tech tends to amplify biases rather than eliminate them.

These issues can manifest in complex ways, like facial recognition that doesn’t see black faces. In an experiment, MIT’s Media Lab documented how facial recognition technology struggles to detect the faces of people with dark skin. In a previous interview with Mic, Brian Brackeen, founder and former CEO of the facial recognition company Kairos, explained why this particular phenomenon occurs.

“Many of these algorithms start in universities, where they use students on campus as data for initial training,” Brackeen said. “If it only sees 12 faces of African descent and 1,000 people of European descent, it will become very adept at detecting European faces, more so than African. The algorithm itself isn’t essentially racist so much as the training.”

There are other recent instances where AI has exhibited bias toward certain groups. In July, Amazon’s face detection software Rekognition falsely matched 28 members of Congress to mugshots of people who had been arrested. Of those 28 members, nearly 40% were people of color, even though people of color make up only about 20% of Congress.

If Amazon ever revives the scrapped resume-ranking software, training it on an equal number of resumes from men and women could lead to smarter software.

Oct 17, 2018 3:15 p.m. ET: This story has been updated.