Recruitment AI may discriminate against people with disabilities

Artificial intelligence tools are increasingly used in human resources, whether for screening applications or drafting job offers. Yet generative AI can discriminate against candidates with disabilities based solely on their resumes, according to a US study.

University of Washington researchers discovered this through an experiment in which they asked ChatGPT-4 to evaluate resumes containing details that signaled the writer’s status as a disabled worker, for instance, a mention that the job seeker had received a disability scholarship.

The researchers fed these resumes into ChatGPT-4 multiple times and compared the results against the original resume, which contained no indication of mental or physical impairment. The task was to select the best profile for a research position at an American software company.
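For readers curious what such a comparison looks like in practice, here is a minimal sketch using the OpenAI Python SDK. The prompt wording, file names, and trial count are illustrative assumptions, not the study’s actual materials or code.

```python
# Rough sketch of a pairwise resume-ranking trial against GPT-4.
# Prompt text and file names are hypothetical illustrations.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def rank_resumes(original: str, enhanced: str) -> str:
    """Ask GPT-4 which of two resumes better fits a research role."""
    prompt = (
        "You are screening candidates for a research position at a "
        "software company. Which resume is the better fit? "
        "Answer 'A' or 'B'.\n\n"
        f"Resume A:\n{original}\n\nResume B:\n{enhanced}"
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

# Repeat the comparison and count how often the enhanced resume
# (the one with disability-related items) is ranked first.
original = open("resume_original.txt").read()
enhanced = open("resume_enhanced.txt").read()
wins = sum(rank_resumes(original, enhanced).startswith("B") for _ in range(60))
print(f"Enhanced resume ranked first in {wins}/60 trials")
```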

In only 25% of the 60 trials did OpenAI’s chatbot rank the enhanced resumes as the best fit for the position.

“In a fair world, the enhanced resumé should be ranked first every time. I can’t think of a job where somebody who’s been recognized for their leadership skills, for example, shouldn’t be ranked ahead of someone with the same background who hasn’t,” said senior study author Jennifer Mankoff.

The researchers found that ChatGPT-4 frequently reinforced ableist attitudes when prompted to justify its decisions. A job seeker with depression, for instance, was said to have an “additional focus on diversity, equity, and inclusion (DEI), and personal challenges,” which “detracts from the core technical and research-oriented aspects of the role,” according to the generative AI.

In fact, according to the study’s lead author, Kate Glazko, “some of GPT’s descriptions would color a person’s entire CV based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resumé.”

To keep the chatbot from stigmatizing workers with disabilities, the researchers then customized ChatGPT-4 with written instructions telling it to avoid ableist bias. This worked to a degree: in 37 of 60 trials, the enhanced resumes now fared better than the original. Still, the generative AI remained biased against applicants with autism or depression.
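The article does not reproduce the researchers’ instructions, but one common way to approximate this kind of customization is a system message, sketched below. The instruction wording here is an assumption for illustration, not the study’s actual text.

```python
# Rough sketch of the mitigation step: the same ranking prompt, prefixed
# with hypothetical anti-bias instructions as a system message.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical instructions; the study's exact wording differs.
FAIRNESS_INSTRUCTIONS = (
    "Rank candidates solely on qualifications relevant to the role. "
    "Do not treat disability-related scholarships, awards, or DEI "
    "involvement as detracting from technical or research experience."
)

def rank_with_instructions(ranking_prompt: str) -> str:
    """Run a ranking prompt with a fairness system message prepended."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": FAIRNESS_INSTRUCTIONS},
            {"role": "user", "content": ranking_prompt},
        ],
    )
    return response.choices[0].message.content.strip()
```

As the study’s results suggest, instructions like these reduce biased rankings but do not eliminate them.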

“People need to be aware of the system’s biases when using AI for these real-world tasks,” Glazko noted.

Contrary to what some may claim, the results show that generative AI can be just as prejudiced as a human recruiter. This is one reason the Artificial Intelligence Act, the legislation that establishes a regulatory framework for the use of AI in Europe, classifies resume-sorting software among “high-risk” AI systems.

Therefore, human resource professionals must exercise prudence when integrating AI software into their operations to reduce the possibility of discrimination.

