Australian research shows that the right use of AI can significantly reduce gender bias in job interviews
MELBOURNE, Australia, March 29, 2022 /PRNewswire/ -- PredictiveHire is pleased to announce that its R&D arm, Phai Labs, will present new research on mitigating gender bias with AI at the annual conference of the Society for Industrial and Organizational Psychology (SIOP), the premier professional association for I-O psychologists, held in Seattle, WA, April 27-30.
The paper, "Identifying and Mitigating Gender Bias in Structured Interview Responses", presented at the symposium on "New Developments in Structured Interviews: From AI to Technical Interviews", focuses on the problem of gender bias in language and shows how this can be minimized through AI.
"I'm thrilled to share our work on how we can create a fairer playing field for everyone who applies for a job," Chief Data Scientist Dr Buddhi Jayatilleke said.
"We're in an incredibly privileged position at PredictiveHire to be able to do research like this given our large dataset of over 5 million unstructured answers to questions in job interviews - all without any identifying information."
Despite the popularity of "blind" resume screening, it has long been established that gender can still be inferred from resume data, which leads to biased hiring by both human interviewers and AI.
As an alternative to resumes, Phai Labs has shown that using only written responses to structured interview questions, scored with a well-defined algorithm based on job-related information, can significantly reduce bias.
Phai Labs is able to conduct this research because candidates using PredictiveHire answer five job-related questions via text. No resumes or other demographic information are recorded, and the only input considered for assessment is the candidates' written responses.
PredictiveHire's proprietary dataset of "clean data", drawn from job candidates' written responses, currently stands at 630 million words and is expected to reach 1 billion by mid-year, making it the largest dataset of its kind.
The findings are important given that training data with higher levels of gender information, such as resumes or video, can amplify biased outcomes.
The study shows that when assessments are made using only interview data and a set of well-defined scoring dimensions, gender bias is significantly reduced even when responses carry higher levels of gender information. For example, interview scoring algorithms developed by PredictiveHire, which use derived features related to personality, behavioral competencies and communication skills, recorded effect sizes of less than 0.1 for gender (using a binary male/female classification).
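The release does not specify which effect-size statistic was used; a common choice for comparing score distributions between two groups is Cohen's d, the standardized mean difference. The sketch below is a minimal, hypothetical illustration of how a per-dimension bias audit against a |d| < 0.1 threshold might look, using synthetic scores; it is not PredictiveHire's actual method.

```python
import numpy as np

def cohens_d(scores_a, scores_b):
    """Cohen's d: standardized mean difference between two groups of scores."""
    a = np.asarray(scores_a, dtype=float)
    b = np.asarray(scores_b, dtype=float)
    # Pooled standard deviation from unbiased per-group variances
    pooled_sd = np.sqrt(
        ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
        / (len(a) + len(b) - 2)
    )
    return (a.mean() - b.mean()) / pooled_sd

# Illustrative audit on synthetic scores for one scoring dimension
rng = np.random.default_rng(0)
scores_female = rng.normal(3.52, 0.80, 5000)  # hypothetical score distribution
scores_male = rng.normal(3.50, 0.80, 5000)    # hypothetical score distribution

d = cohens_d(scores_female, scores_male)
status = "within" if abs(d) < 0.1 else "exceeds"
print(f"Effect size d = {d:.3f} -> {status} the 0.1 threshold")
```

In this framing, an absolute effect size below 0.1 would indicate a negligible mean score difference between the two groups for that dimension.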
The symposium committee said in its selection notes that the work was interesting and raised many important ideas that could spark further research in this area.
The invitation to the symposium comes on top of recent recognition by CogX, the eminent body on AI, for the work PredictiveHire has done in bias mitigation.
About PredictiveHire
PredictiveHire's mission is to help companies unlock and engage talent at scale. Using the world's first Smart Interviewer, powered by the world's largest source of first-party proprietary text data and advanced Natural Language Processing, we turn simple text conversations into unprecedented talent intelligence, enabling organisations to interrupt hiring bias at scale, get to the right talent fast, and give every candidate an experience they love.