An algorithmic recruiter meant to help Amazon find top talent was systematically biased against women, a Reuters investigation found.
Why it matters: This is a textbook example of algorithmic bias. By learning from and emulating human behavior, a machine ended up as prejudiced as the people it replaced.
The details: Amazon’s experimental system, which dates back to 2014, was trained on 10 years of job applications, most of which came from men, reports Reuters’ Jeffrey Dastin.
* The system concluded that men were better candidates for technical jobs.
* By 2015, Amazon had begun to realize that the system was penalizing resumes that included the word “women’s” (as in a women’s sports team or an all-women’s college).