Amazon has been forced to scrap its AI recruitment system after it was discovered to be biased against female applicants.

The AI was developed by Amazon from 2014 as a way of sifting through applications and surfacing the top five candidates for a position. In 2015, however, the firm found that the system was not rating applicants in a gender-neutral way, undermining its attempt to level the playing field by having an ostensibly objective AI handle the early decision-making.
As it turns out, the problem lay in how the system was trained. Reflecting the tech industry's lack of diversity, the model learned from CVs submitted to the company over a 10-year period, the vast majority of which came from men. As it learned to detect patterns in that historical recruiting data, it also learned to devalue the CVs of women.
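Neither Amazon nor Reuters has published technical details of the system, but the failure mode is easy to illustrate. The sketch below is a toy Python example, with invented CVs and a simple per-term log-odds score standing in for whatever Amazon actually built: trained on outcomes that historically favoured male-worded CVs, the scorer ends up penalising the word "women's" on its own.

```python
# A toy illustration, NOT Amazon's system: a simple per-term log-odds score
# learned from a hypothetical, historically skewed set of hiring outcomes.
from collections import Counter
import math

# Invented training data: (CV text, was the candidate hired?). Most past
# hires here use wording more common on male CVs, so the data is skewed.
TRAINING = [
    ("executed backend migration captured requirements", True),
    ("executed distributed systems rewrite rugby team captain", True),
    ("captured performance metrics led infrastructure upgrade", True),
    ("led backend migration strong distributed systems experience", True),
    ("women's chess club captain led infrastructure upgrade", False),
    ("women's college graduate distributed systems experience", False),
]

def term_weights(examples, smoothing=1.0):
    """Per-term log-odds: positive values favour a CV, negative ones penalise it."""
    hired, rejected = Counter(), Counter()
    for text, was_hired in examples:
        (hired if was_hired else rejected).update(text.split())
    vocab = set(hired) | set(rejected)
    n_h, n_r = sum(hired.values()), sum(rejected.values())
    return {
        term: math.log((hired[term] + smoothing) / (n_h + smoothing * len(vocab)))
            - math.log((rejected[term] + smoothing) / (n_r + smoothing * len(vocab)))
        for term in vocab
    }

weights = term_weights(TRAINING)
ranked = sorted(weights.items(), key=lambda kv: (kv[1], kv[0]))
print("most penalised terms:", ranked[:3])   # "women's" scores lowest
print("most favoured terms: ", ranked[-3:])  # "executed" and "captured" are among the highest
```

The model never sees a gender label; it simply learns that wording found on past (mostly male) hires predicts success, and that wording found on the rejected CVs does not.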
According to Reuters, the system taught itself that male candidates were preferable. It downgraded CVs that contained words such as "women's" and penalised graduates of all-women's colleges.
While Amazon recoded the software to make the AI neutral to these particular terms, it realised that this was no guarantee that the technology would not find other ways of sorting candidates that discriminated against women, the report said.
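The report does not say how the engineers tested for this, but the risk is straightforward to demonstrate: even with the flagged word stripped out, terms that tended to appear alongside it still separate the CVs into the same groups, giving a model a proxy route back to the original bias. Another toy sketch, again with invented data:

```python
# Toy illustration of "proxy" bias: the flagged word has already been removed,
# yet co-occurring terms still split the two groups of CVs cleanly.
from collections import Counter

# Invented CV fragments with the word "women's" already stripped out.
formerly_flagged = ["chess club captain", "college athletics and chess club"]
never_flagged    = ["rugby team captain", "rowing club and systems programming"]

def counts(docs):
    return Counter(word for doc in docs for word in doc.split())

a, b = counts(formerly_flagged), counts(never_flagged)
# Terms such as "chess" or "college" appear in only one group, so a scoring
# model can latch onto them as stand-ins for the word that was removed.
for term in sorted(set(a) | set(b)):
    print(f"{term:12s} group_a={a[term]}  group_b={b[term]}")
```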
The team, set up in Amazon's Edinburgh engineering hub, created 500 models focused on specific job functions and locations, and taught the system to recognise around 50,000 terms that showed up on past candidates' CVs.
The technology learned to assign little importance to skills that were common across IT applicants, instead favouring verbs more often found on male engineers' CVs, such as "executed" and "captured", the report said.
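The report names only two example verbs, but the pattern is a natural consequence of any bag-of-words scorer: a term that appears on every applicant's CV is useless for telling past hires apart, so it gets almost no weight, while wording that happens to correlate with past hires is weighted heavily. A rough, self-contained sketch using scikit-learn and invented data (not Amazon's 500 models or 50,000 terms) shows the effect:

```python
# A rough sketch with invented data, not Amazon's system: a bag-of-words
# logistic regression gives near-zero weight to a skill every applicant lists
# ("python") and positive weight to wording correlated with past hires
# ("executed", "captured").
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

cvs = [
    "python executed backend migration captured requirements",
    "python executed distributed systems rewrite",
    "python captured performance metrics led upgrade",
    "python chess club captain led upgrade",
    "python college graduate distributed systems experience",
    "python led backend migration distributed systems experience",
]
hired = [1, 1, 1, 0, 0, 1]  # hypothetical past outcomes

vec = CountVectorizer()
X = vec.fit_transform(cvs)
clf = LogisticRegression().fit(X, hired)

weights = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
for term in ("python", "executed", "captured", "chess"):
    print(f"{term:10s} weight={weights[term]:+.2f}")
```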
The model underpinning the system had other problems too, which led to unqualified candidates being recommended for a variety of unsuitable jobs.
This eventually led to Amazon pulling the plug on the team, as executives "lost hope" for the project, anonymous sources told Reuters. The tool could not be relied upon on its own to sort candidates.
The firm now uses a "much-watered-down version" of the recruiting engine to carry out "rudimentary chores".