'Employers are inadvertently, automatically, screening out really great candidates,' says expert
A strong majority (88%) of employers believe they are losing out on highly qualified candidates who are screened out of hiring processes by applicant tracking systems because their resumes aren’t ‘ATS-friendly,’ new data from Select Software Reviews shows.
These systems are only as smart as they’re programmed to be, according to Joseph A. Allen, professor of industrial and organizational psychology at the University of Utah.
So, when a resume doesn’t match one of the programmed formats, or when valuable skills aren’t captured in the traditional resume sections the system expects, high-quality applicants may be overlooked by the system.
“The implication here is employers are inadvertently, automatically, screening out really great candidates, and top-talent individuals are going to end up at your competitor who, for whatever reason, is using a low-tech intern to identify these candidates,” Allen said.
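To make the failure mode Allen describes concrete, here is a minimal sketch of a naive keyword screen, not any real ATS product: it rejects a strong resume simply because the candidate describes the required skills in different words. The job keywords and resume text are hypothetical.

```python
# Minimal sketch (hypothetical, not any vendor's ATS): a naive keyword
# filter that only accepts resumes containing the exact required phrases.

REQUIRED_KEYWORDS = {"people management", "budget ownership"}  # hypothetical job criteria

def passes_screen(resume_text: str) -> bool:
    """Return True only if every required keyword appears verbatim."""
    text = resume_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

strong_candidate = (
    "Led a 12-person team, owned a $2M budget, "
    "and mentored three engineers into senior roles."
)

# The candidate clearly has the skills, but never uses the exact phrases,
# so a verbatim-match screen drops them.
print(passes_screen(strong_candidate))  # False
```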
This is where employers need to weigh how big a loss it is if they don’t hire the top candidate for a role. For an entry-level position with many qualified applicants, missing out on the top candidate may not be a big deal.
The issue emerges, however, when an employer receives a large volume of applications for a position that hinges on key characteristics that may or may not be captured in a traditional resume, he said.
“It’s really a cost-benefit analysis; HR managers and HR professionals need to think about ‘How do I calibrate my system to improve my decision making so I avoid making mistakes that are not necessarily mistakes, but avoid missing out on great people, while at the same time, getting some of that cost benefit of not having to sit there and look at 500 applications?’”
To get the most out of these automated systems, HR professionals need to interrogate them before implementation and then monitor them in case problems arise, said Cari Miller, lead researcher at The Center for Inclusive Change.
“People are imperfect, and we're never going to catch everything, so a best practice in responsible AI for any system is to always have what is called an adverse incident management approach, or protocols for when something goes wrong,” she said. “You have to have a process where someone can report it, so you can update or deal with it.”
Alongside ensuring these systems are properly programmed, HR professionals can also review the applications the system has filtered to confirm it is doing its job and to catch common mistakes, said Subodha Kumar, professor of statistics, operations and data science at Temple University.
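One way such a review might be operationalized, as a rough sketch rather than a prescribed method, is to pull a small random sample of screened-out applications for human reading. The field names below ("id", "rejected") are hypothetical and not tied to any particular ATS export.

```python
# Illustrative sketch of the spot-check Kumar suggests: sample a handful of
# auto-rejected applications so a person can read them by hand.
import random

def sample_rejections(applications: list[dict], k: int = 20) -> list[dict]:
    """Return up to k randomly chosen applications the system screened out."""
    rejected = [app for app in applications if app.get("rejected")]
    return random.sample(rejected, min(k, len(rejected)))

# A reviewer reads this batch to see whether qualified people are being
# dropped, feeding the kind of incident-reporting process Miller describes.
audit_batch = sample_rejections(
    [{"id": 1, "rejected": True}, {"id": 2, "rejected": False}, {"id": 3, "rejected": True}],
    k=2,
)
print([app["id"] for app in audit_batch])
```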
It is also key to understand which kinds of jobs these systems handle best, since how well they work can depend on the skills a role requires.
“With many of these kinds of systems, on average, they will certainly do a better job, they will reduce time, they will improve efficiency, no question about that,” he said. “But many times, this AI misses common sense and the intuition part, which leads them sometimes to make mistakes that humans would not do.”