Cambridge University found AI recruitment software discriminated against candidates for spurious reasons like their home decor

Artificially intelligent recruitment programmes are discriminating against people who wear glasses or sit in front of bare walls, academics have warned, as they urged companies to stop relying on "pseudoscientific" software.

Many companies now use algorithms to sift through large numbers of candidates, helping them determine not only which applicants have the correct qualifications but also which have the best personality.

AI recruitment firms claim that the programmes bypass unconscious human bias and remove discrimination, while homing in on people who are likely to be conscientious team players.

But a new analysis by Cambridge University found that they often discriminate against people for spurious reasons, such as their home decor, clothing or lighting.

In video interviews, AI programmes tended to favour people sitting in front of bookshelves or with art on their walls. They also recommended applicants wearing headscarves, believing them less neurotic, while judging people who wore glasses as less conscientious.

“All too often, the hiring process is oblique and confusing,” said Euan Ong, an algorithm developer at Cambridge University.

“These tools are trained to predict personality based on common patterns in images of people they’ve previously seen, and often end up finding spurious correlations between personality and apparently unrelated properties of the image.”
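
To make the failure mode Ong describes concrete, here is a minimal, hypothetical sketch (not the researchers' code or any vendor's system). It trains a toy classifier on synthetic data in which an irrelevant background feature, a "bookshelf present" flag, happens to co-occur with the "conscientious" label, and the model duly learns to lean on it:

```python
# Toy illustration of a spurious correlation, not actual recruitment software.
# Assumes synthetic data where a bookshelf flag is irrelevant to personality
# but, in this sample, happens to co-occur with the "conscientious" label.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Genuine trait and the label derived from it.
trait = rng.normal(size=n)
label = (trait > 0).astype(int)  # 1 = "conscientious"

# Spurious feature: aligned with the label 90% of the time, purely as an
# artefact of how the training sample was collected.
bookshelf = (label ^ (rng.random(n) < 0.1)).astype(float)
noise = rng.normal(size=(n, 3))  # other irrelevant image features

X = np.column_stack([bookshelf, noise])
model = LogisticRegression().fit(X, label)

# The heaviest learned weight sits on the bookshelf flag, so the "personality"
# score is driven by the background, not the person.
print("learned weights:", model.coef_.round(2))
```

Nothing here is specific to images: any model fitted to data where decor and labels happen to move together will pick up the same shortcut.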

The researchers warned that AI is being used unscientifically to infer personality traits from minute gestures, such as head tilts, speech intonation and vocabulary.

Algorithm developers often claim their programmes can spot the “Big Five” personality traits: conscientiousness, extroversion, openness, neuroticism and agreeableness.

But the academics point out that people can demonstrate conscientiousness or agreeableness in different ways, and there is no universal interpretation of what somebody means or intends by a gesture or phrase.

The team said using AI to judge personality was “automated pseudoscience” reminiscent of physiognomy or phrenology, the discredited beliefs that character can be deduced from facial features and skull shape.

“We are concerned that some vendors are wrapping ‘snake oil’ products in a shiny package and selling them to unsuspecting customers,” said co-author Dr Eleanor Drage.

“While companies may not be acting in bad faith, there is little accountability for how these products are built or tested.

“As such, this technology, and the way it is marketed, could end up as dangerous sources of misinformation about how recruitment can be ‘de-biased’ and made fairer.”

A 2020 study of 500 organisations across various industries in five countries found that 24 per cent of businesses had implemented AI for recruitment purposes and that 56 per cent of hiring managers planned to adopt it within the next year.

Another poll of 334 leaders in human resources, conducted in April 2020 as the pandemic took hold, found that 86 per cent of organisations were incorporating new virtual technology into hiring practices.

The rise in working from home, coupled with pandemic restrictions in offices, has also led to more companies using the software.

“This trend was already in place as the pandemic began, and the accelerated shift to online working caused by Covid-19 is likely to see greater deployment of AI tools by HR departments in future,” said co-author Dr Kerry Mackereth.

“Volume recruitment is increasingly untenable for human resources teams that are desperate for software to cut costs as well as numbers of applicants needing personal attention.”

To show how biased the programmes are, the Cambridge team has created its own version, called The Personality Machine, which allows people to alter their image to see how it affects their perceived character.

The team hopes it will help prospective candidates to “beat the algorithms” and give job seekers a taste of the kind of AI scrutiny that they are under.
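
Continuing the hypothetical sketch above (the variable and feature names are illustrative, not The Personality Machine's actual interface), this kind of probe can be mimicked by holding a candidate's features fixed and toggling only the background flag:

```python
# Same toy model as before: compare scores for one candidate with the
# background changed and everything else held fixed.
candidate = np.array([[1.0, 0.2, -0.5, 0.3]])  # bookshelf present + noise features
bare_wall = candidate.copy()
bare_wall[0, 0] = 0.0                          # bookshelf removed

print("with bookshelf:", model.predict_proba(candidate)[0, 1].round(2))
print("bare wall:     ", model.predict_proba(bare_wall)[0, 1].round(2))
```

The gap between the two scores is exactly the kind of decor-driven artefact the tool is designed to expose.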

The research was published in the journal Philosophy & Technology.
