'It all comes down to what data are you relying on,' says expert
As artificial intelligence makes its way into recruitment, there have been growing concerns that it could be just as biased as the hiring processes it promises to improve.
But for Barb Hyman, CEO of Sapia.ai, whether AI-based recruitment is biased comes down to the data the tool is built on.
"If you build a model to assess and evaluate someone for a job and it's based on human data, you'll end up with biased outcomes; if you use resume data, you'll end up with biased outcomes," she told HRD.
Hyman cited the case of Amazon's AI recruiting tool, which was found in 2018 to favour male candidates over female ones.
"Well, it's only based on resume data and, guess what, resume data is inherently biased," she said. "All that resume does is just record signals that are very biased when often what people care about is your values and your character."
According to Hyman, resume data is what employers have relied on for decades, but it has little utility when hiring for culture and values, and it carries information that cannot be separated from bias.
"So that's where the risk is, it all comes down to what data are you relying on to assess whether that person's data profile fits this one or not," she said.
The Diversity Council Australia (DCA) released a set of guidelines last year to help employers ensure that their recruitment process remains inclusive while using AI tools.
The guidelines followed DCA research warning that AI has the potential to mirror society's inequities, especially when it is not designed with diversity in mind.
According to DCA CEO Lisa Annese, AI recruitment tools can still support diversity when used correctly.
"It's very difficult to find a tool that does everything, that lacks any kind of bias or that is going to deliver 100% inclusive outcomes across a range of demographic groups," she previously told HRD.
"So the more information you have about what kind of tool you're using — How is it designed? Who designed it? What are the algorithms that it uses? What is the data set that it's been trained on? How is does that data set get applied? — once you understand all of that, you can make a more informed decision."
The discussion over biases comes as AI's role in recruitment expands from mere automation to potentially improving diversity outcomes.
"The ability to shift almost immediately the diversity of your hires is quite incredible with AI," Hyman said.
Sapia.ai is among the organisations introducing AI tools that employers can use to screen, interview, and assess applicants while minimising bias.
"It's AI-based using data that doesn't have any human data, it doesn't have any social data, it doesn't have any resume data, it's a very clean data set," she explained. "You're able to really deliver quite amazing outcomes in terms of improved diversity through building models based on that kind of data set."
According to Hyman, other benefits of using AI tools in recruitment include greater efficiency, reduced turnover, and keeping an employer's brand positive even when a candidate is rejected.
She underscored that employers who refuse to use AI will likely be left behind because of the technology's transformative nature.
"Transformative not just for your team — transformative for diversity, transformative for your culture, transformative for your brand," she said. "You're going to be left behind if you don't start to embrace the opportunities from AI. It surrounds us."
The main thing is to get educated and start building your own AI fluency, Hyman said. "Get educated so that you don't create risk."