
Amazon built an AI to hire people, but had to shut it down because it was discriminating against women

Amazon tried building an AI tool to help with recruiting, but it showed a bias against women, Reuters reports. Engineers found the AI was unfavourable towards female candidates because it had been trained on a decade's worth of predominantly male resumes.

  • Amazon tried building an AI tool to help with recruiting, but it showed a bias against women, Reuters reports.
  • Engineers found the AI was unfavourable towards female candidates because it had been trained on a decade's worth of predominantly male resumes.
  • Amazon reportedly abandoned the project at the beginning of 2017.

Amazon worked on building an AI to help with hiring people, but the plans backfired when it discovered the system discriminated against women, Reuters reports.

Citing five sources, Reuters said Amazon set up an engineering team in Edinburgh, Scotland, in 2014 to find a way to automate its recruitment.

They created 500 computer models to trawl through past candidates' resumes and pick up on around 50,000 key terms. The system would crawl the web to recommend candidates.


"They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those," one source told Reuters.

A year later, however, the engineers noticed something troubling about their engine: it didn't like women. The AI had apparently learned who to hire from resumes submitted to Amazon over a 10-year period, most of which came from men.

Consequently, the AI concluded that men were preferable. It downgraded resumes containing the word "women's," and filtered out graduates of two all-women's colleges.
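Reuters did not describe Amazon's actual model, but the mechanism is easy to illustrate. The toy sketch below (invented data, not Amazon's system) scores resume words by how much more often they appeared in historically "hired" resumes than "rejected" ones. Because the hired pile is male-dominated, a word like "women's" picks up a negative weight purely from that imbalance, not from anything about candidate ability.

```python
from collections import Counter
import math

# Toy historical data, invented for illustration. The imbalance mirrors
# the article: resumes that led to a hire rarely mention "women's".
hired = [
    "software engineer java leadership",
    "software developer python chess club",
    "engineer systems java",
]
rejected = [
    "women's chess club captain software",
    "women's college graduate developer",
    "sales associate retail",
]

def token_scores(pos_docs, neg_docs):
    """Smoothed log-odds of each token appearing in hired vs. rejected resumes."""
    pos = Counter(t for d in pos_docs for t in d.split())
    neg = Counter(t for d in neg_docs for t in d.split())
    vocab = set(pos) | set(neg)
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    return {
        t: math.log((pos[t] + 1) / (n_pos + len(vocab)))
           - math.log((neg[t] + 1) / (n_neg + len(vocab)))
        for t in vocab
    }

scores = token_scores(hired, rejected)
# "women's" ends up with a negative weight because of the historical
# imbalance in the training data, while "java" scores positively.
```

Removing the offending token, as Amazon's engineers reportedly tried, doesn't fix the underlying skew: the model can latch onto any other word correlated with gender in the same way.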

Amazon's engineers tweaked the system to remedy these particular forms of bias, but couldn't be sure the AI wouldn't find new ways to unfairly discriminate against candidates.

Gender bias was not the only problem, Reuters' sources said. The computer programs also recommended candidates who were unqualified for the position.


Remedying algorithmic bias is a thorny issue, because algorithms can absorb unconscious human bias from their training data. In 2016, ProPublica found that risk-assessment software used to forecast which criminals are most likely to reoffend exhibited racial bias against black people. Over-reliance on AI for tasks like recruitment, credit scoring, and parole judgements has also created issues in the past.

Amazon reportedly abandoned the AI recruitment project by the beginning of last year after executives lost faith in it. Reuters' sources said that Amazon recruiters looked at recommendations generated by the AI, but never relied solely on its judgement.

Amazon declined to comment when approached by Business Insider, but said it is committed to workplace diversity and equality.
