'If organisations select the right artificial intelligence tool, they can increase things like accessibility, fairness, objectivity, and transparency': CEO
In 2023, Diversity Council Australia (DCA) released a set of guidelines to help employers address possible risks to diversity and inclusion when using artificial intelligence (AI) in recruitment.
The guidelines were part of the final stage of DCA’s three-year Inclusive AI at Work in Recruitment project.
The first phase looked at the positive and negative effects of using AI in recruitment; the second examined the current state of AI-supported recruitment in Australia; and the final phase explored how organisations can use AI in recruitment to assist, rather than hinder, their diversity and inclusion efforts.
“What we found was that essentially artificial intelligence recruitment tools actually mirror society's inequalities and they take in systemic biases unless you deploy them with a strong focus on diversity and inclusion,” DCA CEO Lisa Annese told HRD Australia.
AI and recruitment
Annese explained that when used correctly, AI can support diversity.
“If organisations consider diversity and they make it a priority and they select the right artificial intelligence tool, they can increase things like accessibility, fairness, objectivity, and transparency,” she said. “They can broaden the applicant pools and they can result in more diversified hires.”
However, there is always a trade-off with using AI or any other tool, she added.
“Ultimately, recruitment is all about excluding people until you find the candidate that you’re looking for,” Annese said. “So if you use an artificial intelligence tool and you've got a specific diversity objective, you can pick the right tool which will assist you and support you, but it will have other consequences. And you have to understand what those consequences are.
“It’s very difficult to find a tool that does everything, that lacks any kind of bias or that is going to deliver 100% inclusive outcomes across a range of demographic groups. So the more information you have about what kind of tool you’re using, the better: How is it designed? Who designed it? What are the algorithms that it uses? What is the data set that it’s been trained on? How does that data set get applied? Once you understand all of that, you can make a more informed decision.”
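To make that idea concrete, here is a minimal sketch of one check an employer’s team could run on a tool’s pilot output: the “four-fifths” adverse-impact rule commonly used in recruitment auditing. This example is not part of DCA’s guidelines, and the group names and numbers are entirely hypothetical.

```python
# A minimal sketch of an adverse-impact ("four-fifths" rule) check on an
# AI screening tool's pilot output. All data here is hypothetical, purely
# to illustrate the kind of question an informed buyer could ask.

def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (shortlisted, total_applicants)."""
    return {group: shortlisted / total
            for group, (shortlisted, total) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Compare each group's selection rate against the highest-rate group.

    A ratio below 0.8 is a conventional red flag that the tool may be
    disproportionately screening out that group.
    """
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

# Hypothetical pilot results from an AI screening tool.
pilot = {
    "group_a": (120, 400),  # 30% shortlisted
    "group_b": (45, 220),   # ~20% shortlisted
}

for group, ratio in adverse_impact_ratios(pilot).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} [{flag}]")
```

A check like this does not prove a tool is biased, but a flagged ratio tells the team exactly where to start asking the vendor about design, algorithms and training data.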
Annese also acknowledged that every tool will have its limitations.
DCA’s AI in recruitment guidelines
DCA’s Inclusive AI at Work in Recruitment Employer Guidelines are designed to help employers select and deploy the right AI tools. The guidelines follow a five-step process under the acronym TREAD: team up, reflect, educate, acquire and decide.
Team up: The first step is putting together a diverse team whose range of perspectives and expertise can be leveraged to identify and address any biases, Annese said.
“If you are trying to address an underrepresentation of a particular group, then make sure in that team you have people from that group so that they can ask the right questions,” she said. “That's a skill that organisations should apply to all their decision making – diverse teams are more effective in making better decisions that minimise risk.”
Reflect: This step requires organisations to reflect on their own readiness, Annese said.
“Once you've created your D&I (diversity and inclusion) impact team, you need to consider your organisational readiness with regards to AI recruitment,” Annese said. “And the two factors that you need to really focus on there are your D&I maturity – where are you on that journey of D&I experience – but also your AI maturity. You can't have one and not the other.”
Educate: The third step involves educating your team about bias in recruitment, Annese said.
“Bias plays out in recruitment in a million different ways,” she said. “You have to understand that so that you can spot it when you’re picking a tool. You have to know, if I use a video interview, that these are the limitations of video interviews; this is how they might exclude people who don’t make eye contact, for example, or people who live in a low socio-economic area where the internet is poor, or people who might have an assistance dog in the frame when they’re doing an interview.
“You need to understand, when tools are deploying things like an emphasis on natural language patterns, how that might impact people for whom English is a second language or who aren’t tertiary educated. So understanding how bias plays out in recruitment will help that team pick out the right tool.”
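By way of illustration only, and again not drawn from DCA’s guidelines, the sketch below shows the kind of score-gap check this education step might equip a team to run on a tool that weights natural language patterns. The candidate scores are invented, and the assumption that the two groups are matched on qualifications is part of the example, not a claim about any real tool.

```python
# A sketch of a score-gap check: does a tool that weights natural
# language patterns score otherwise comparable candidates differently
# by English background? The scores below are invented for illustration.
from statistics import mean, stdev

# Hypothetical tool scores (0-100) for two groups of pilot candidates
# assumed to be matched on qualifications.
native_speakers = [72, 68, 75, 80, 77, 70, 74]
esl_candidates = [61, 65, 58, 70, 63, 66, 60]

gap = mean(native_speakers) - mean(esl_candidates)
# Pooled standard deviation gives a rough effect size (Cohen's d).
pooled_sd = ((stdev(native_speakers) ** 2 + stdev(esl_candidates) ** 2) / 2) ** 0.5
effect_size = gap / pooled_sd

print(f"Mean score gap: {gap:.1f} points (Cohen's d = {effect_size:.2f})")
if effect_size > 0.5:
    print("Moderate-to-large gap: investigate the tool's language features.")
```

A large standardised gap between matched candidates would prompt the team to ask the vendor exactly how language features enter the model before shortlisting the tool.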
Acquire: “The fourth step is around acquiring expertise,” Annese said. “Once you educate your team about bias in recruitment, you need to acquire expertise on how to understand how bias impacts AI tools, specifically.”
Decide: “And then finally, you'd want to decide how to proceed inclusively using an AI tool,” Annese said.
In addition to the guidelines, DCA also developed a checklist to help employers make an informed decision about whether a particular AI recruitment tool helps rather than hinders diversity, equity and inclusion outcomes.
And Annese emphasised the importance of HR teams building their capability and knowledge around AI so that they can select the right tools and deploy them properly.
“This is not a simple undertaking,” Annese said. “Recruitment’s one of the most important decisions any employer can make. It’s worth the investment in getting it right so that you can get access to really fantastic people.”