A company that used artificial intelligence ("AI") software to recruit language tutors will pay $365,000 to job candidates to settle a discrimination case filed by the Equal Employment Opportunity Commission ("EEOC").
The EEOC alleged that the software in question, created by iTutorGroup, used an algorithm that automatically excluded female candidates aged 55 or older, and male candidates aged 60 or older, based on date of birth. This exclusion would be a blatant violation of the Age Discrimination in Employment Act of 1967.
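To make concrete what an age-based screen of this kind looks like in practice, here is a minimal, hypothetical Python sketch of a rule that rejects applicants once their computed age crosses a gender-specific cutoff. The thresholds mirror the EEOC's allegations, but the function names and structure are purely illustrative and are not iTutorGroup's actual software.

```python
from datetime import date

# Hypothetical illustration of the kind of screening rule the EEOC described:
# applicants are rejected based on age computed from date of birth, with
# different cutoffs by gender (women 55+, men 60+). Not actual iTutorGroup code.

AGE_CUTOFF = {"female": 55, "male": 60}

def age_on(dob: date, today: date) -> int:
    """Return whole years elapsed between dob and today."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def passes_screen(dob: date, gender: str, today: date) -> bool:
    """Return False when the applicant meets or exceeds the cutoff for their gender."""
    cutoff = AGE_CUTOFF.get(gender.lower())
    if cutoff is None:
        return True  # no rule defined for this value
    return age_on(dob, today) < cutoff

# Example: a 56-year-old woman is rejected outright, regardless of qualifications.
print(passes_screen(date(1967, 1, 1), "female", today=date(2023, 6, 1)))  # False
```

A few lines of filtering logic like this can apply a facially discriminatory policy to every applicant automatically, which is part of why the conduct alleged here was so easy to identify.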
While the problems with the iTutorGroup algorithm were easy to spot in this particular case, discrimination in algorithms is not always so obvious. AI relies on historical data, which can reflect historical patterns of exclusion and can therefore perpetuate bias. Amazon learned this in 2018 when it discovered that its AI hiring tool was not gender-neutral and favored male candidates.
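As a toy illustration of how historical bias carries forward, the sketch below "trains" on synthetic past hiring decisions that favored one group; any model whose objective is simply to imitate those decisions will reproduce the same skew for new candidates. The data and the per-group "model" are invented for illustration and have no connection to Amazon's tool.

```python
import random

# Synthetic, illustrative data only: equally qualified candidates, but the
# historical record shows group "A" hired about 70% of the time and group "B"
# about 30% of the time.
random.seed(0)
history = [("A", random.random() < 0.7) for _ in range(500)] + \
          [("B", random.random() < 0.3) for _ in range(500)]

def fit_hire_rates(records):
    """'Train' by memorizing the historical hire rate for each group."""
    rates = {}
    for group in {g for g, _ in records}:
        decisions = [hired for g, hired in records if g == group]
        rates[group] = sum(decisions) / len(decisions)
    return rates

model = fit_hire_rates(history)
print(model)  # roughly {'A': 0.7, 'B': 0.3} -- the past bias becomes the future rule
```

Real hiring models are far more complex, but the underlying dynamic is the same: a system optimized to replicate past decisions will also replicate the exclusion embedded in them, often through proxy features rather than an explicit rule.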
As AI discrimination cases become more common, employers should make sure they understand how AI is used in any software they adopt as part of their recruitment process.