
How artificial intelligence is unmasking bias throughout the recruitment process



Credit: Pixabay/CC0 Public Domain

New research from the Monash Business School has found that, throughout the job recruitment process, women believe artificial intelligence assessments reduce bias, while men fear they remove an advantage.

Professor Andreas Leibbrandt, from the Department of Economics, investigated how artificial intelligence recruitment tools affect existing biases in recruitment, and whether they offer a way to dismantle the barriers that prevent underrepresented groups from reaching their full potential and securing their desired roles.

“People in minority groups have inferior market outcomes, they earn less, they have a harder time finding and keeping a job. It’s important to understand why that is the case so that we can identify and remove the barriers,” Professor Leibbrandt said.

One major hurdle lies in the recruitment process itself, which is undergoing a shift alongside the rise of AI. “We know that a large majority of organizations now use AI in their recruitment process,” he said.

To uncover recruitment barriers, the first-of-its-kind study focused on the two key areas of applicant behavior and recruiter bias.

In one field experiment, more than 700 applicants for a web designer position were informed whether their application would be assessed by AI or by a human.

“Women were significantly more likely to complete their applications when they knew AI would be involved, while men were less likely to apply,” he said.

A second experiment focused on the behavior of 500 tech recruiters.

“We found that when recruiters knew the applicant’s gender, they consistently scored women lower than men. However, this bias completely disappeared when the applicant’s gender was hidden,” he said.

When recruiters had access to both the AI score and the applicant’s gender, there was also no gender difference in scoring.

“This finding shows us they use AI as an aid and anchor—it helps remove the gender bias in assessment.”

Professor Leibbrandt said a crucial aspect of the study was that, in contrast to the vast majority of current research, it focused on the human interaction with AI, rather than the algorithm behind it.

“My research isn’t just about dismantling bias, it’s about building a future of work where everyone has the opportunity to thrive,” he said.

Professor Leibbrandt is exploring other frontiers in the fight for workplace inclusion.

One project will test the impact of telling job applicants who are assessed by AI about potential bias in AI training data.

He also plans to tackle the concept of 'narrative discrimination', where unconscious stereotypes influence hiring decisions in the tech industry, and to explore the potential for bias in remote work settings.

Provided by
Monash University


Citation: How artificial intelligence is unmasking bias throughout the recruitment process (2024, October 11), retrieved 11 October 2024




