
AI in recruiting, good or bad for diversity?


3 March 2022 at 4:07 pm
Maggie Coggan
New research has found that AI has the potential to make recruitment fairer, but that hiring professionals and tech developers need the skills and know-how to do it first.



Artificial intelligence (AI) is being used across Australia by recruiters to find new employees. 

But new research from Monash University, Diversity Council Australia (DCA) and recruitment agency Hudson RPO has found that many recruiters are unsure how to use these AI tools to reduce bias. 

How are recruiters using AI technology?

Recruiters are using the technology for a wide range of tasks, including: 

    • natural language processing software to analyse a candidate’s personality and values; 
    • social media analysis; 
    • analysis of video interviews, including facial expressions, tone of voice and emotions; 
    • video resumes; 
    • software that screens resumes for keywords; 
    • shortlisting candidates based on keywords;
    • anonymised resumes (a simple sketch of keyword screening with anonymised resumes appears after this list);
    • skills-based assessment; 
    • game-based recruitment;
    • chatbots to communicate with and provide feedback to candidates;
    • reference checking systems; and
    • embedded candidate management systems. 
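
To make the keyword-screening and anonymised-resume items above more concrete, here is a minimal, hypothetical Python sketch of how such a tool might rank candidates. The field names, keywords and scoring rule are illustrative assumptions only; they are not drawn from the research or from any vendor's product.

```python
# Hypothetical sketch: keyword-based resume screening with a simple
# anonymisation step. Keywords, fields and scoring are illustrative only.

from dataclasses import dataclass


@dataclass
class Resume:
    name: str   # removed from the text before scoring to reduce identity-based bias
    text: str   # free-text resume body


# Example role criteria (assumed for illustration)
ROLE_KEYWORDS = {"python", "stakeholder engagement", "grant writing"}


def anonymise(resume: Resume) -> str:
    """Strip the candidate's name from the resume text before scoring."""
    return resume.text.replace(resume.name, "[candidate]")


def keyword_score(text: str, keywords: set[str]) -> int:
    """Count how many role keywords appear in the (anonymised) resume text."""
    lowered = text.lower()
    return sum(1 for kw in keywords if kw in lowered)


def shortlist(resumes: list[Resume], keywords: set[str], top_n: int = 2) -> list[Resume]:
    """Rank candidates by keyword score and return the top N."""
    ranked = sorted(resumes, key=lambda r: keyword_score(anonymise(r), keywords), reverse=True)
    return ranked[:top_n]


if __name__ == "__main__":
    candidates = [
        Resume("Alex Ng", "Grant writing and stakeholder engagement across the social sector."),
        Resume("Sam Lee", "Python developer with experience in data analysis."),
    ]
    for r in shortlist(candidates, ROLE_KEYWORDS):
        print(r.name, keyword_score(anonymise(r), ROLE_KEYWORDS))
```

A real product would be far more sophisticated, but the sketch shows the basic idea the list describes: strip identifying details, then score and shortlist resumes against role keywords.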

The research, the first in a series of studies exploring the impact of unconscious bias on recruitment, found there was potential for AI to make recruitment fairer, but that hiring professionals and tech developers needed the skills and know-how to do it first. 

“Developers say AI can remove bias from recruitment to drive workplace diversity and inclusion, but the research heard recruiting professionals were unsure how to customise the tools they use, which left them uncertain about their impact on bias in recruitment,” said DCA CEO Lisa Annese.  

And it’s not just up to recruiters to fix the problem. Annese said that it was also important for tech developers to apply diversity and inclusion principles during the design and testing of these tools.  

So how can they be supported? 

Lead researcher, Monash University Professor Andreas Leibbrandt, said that if AI tools were going to disrupt biases in recruitment, users needed at least a basic understanding of their functioning, limitations and strengths.   

“To generate this level of understanding, it is crucial that AI developers and providers, HR leaders and recruiters, and social scientists work together,” Leibbrandt said.   

Kimberley Hubble, Hudson RPO CEO, said one thing recruiters could do was be clear about their talent acquisition strategy, which roles might benefit from an AI-based tool, and where in the process it would be used.   

“Like any technology, it isn’t about using AI for AI’s sake,” Hubble said.    

“Recruitment and HR leaders need to focus on driving outcomes and then working out where AI fits within this framework.” 

If you’re interested in checking out the full research, head here.


Maggie Coggan  |  Journalist  |  @MaggieCoggan

Maggie Coggan is a journalist at Pro Bono News covering the social sector.

