Could an algorithm replace HR?

Can programs really make better hiring decisions than humans? One software company is certain that’s the case.

Turnover at one of Adventist HealthCare’s hospitals, in Maryland, was higher than the market average, costing the group money and, if the research is right, possibly threatening the quality of care. The hospital had recruited well-trained people with all the right credentials and promptly lost the ones who just didn’t click.

What went wrong?

Replacing a lost worker costs companies an average of six to nine months’ salary, between recruitment and training costs and productivity losses, according to some estimates. “If you can keep them for two years instead of one, you saved yourself $6,000 to $20,000,” said Bill Robertson, who was chief executive officer of Adventist at the time, about five years ago. That savings goes up the longer the employee sticks around. If companies could, they would hire people who never quit—or had to be fired.
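
To put Robertson's arithmetic in rough terms: if replacing a worker costs six to nine months of that worker's salary, the saving from keeping someone an extra year is essentially that replacement cost deferred. The sketch below works through the idea with a made-up salary; the dollar amounts are illustrative, not Adventist's or Pegged's numbers.

```python
# Back-of-the-envelope turnover cost, using the "six to nine months'
# salary" rule of thumb cited in the article. The salary here is a
# made-up example, not a figure from Adventist or Pegged.
def replacement_cost(annual_salary, months_to_replace):
    """Estimated cost of replacing one employee."""
    return annual_salary * months_to_replace / 12

salary = 40_000  # hypothetical annual salary
low, high = replacement_cost(salary, 6), replacement_cost(salary, 9)
print(f"Replacing one employee: ${low:,.0f} to ${high:,.0f}")
# Keeping that employee an extra year avoids paying this cost again,
# which is the kind of saving Robertson is describing.
```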

That’s where algorithms come in. Over the past few years, the hiring-algorithm space has become crowded with companies promising to “hire better than a human.” In the recruitment-software world, better comes in many flavours, but most companies don’t look at post-hire metrics. An algorithm, a formula that takes in data and spits out a result, might fill the slot faster or find more diverse candidates. But how did those people do in their job? And are they still there one, two, three years down the road?

How about forever?

“I don’t think it’s very common to connect pre-hire and post-hire together,” said Kieran Snyder, co-founder and CEO of Textio, which optimizes job postings for companies including Starbucks and Barclays. Textio is developing technology to bridge the gap.

Robertson wanted an algorithm for Adventist that could identify people who “best fit within our organization.” He turned to a private Baltimore hiring-algorithm company called Pegged, launched in 2009. While some companies use algorithms to predict when an employee will want to leave, Pegged promises to hire people who will never get to that point. Consultants such as Aon Hewitt and Towers Watson make similar claims.

After what Robertson called a “rough” adjustment period, when the algorithms weren’t showing results, Adventist saw “significant reductions in turnover,” he said, putting it at somewhere between a 30 percent and 50 percent reduction. “It was worth millions of dollars in terms of reduced productivity and the cost of hiring,” he said. Robertson has since moved on to a new health group, MultiCare, where he recently implemented Pegged.

The data Pegged uses come from three different buckets: public records, including anything you can Google; background information, such as résumés and credentials; and interaction data drawn from prospective employees’ applications. By measuring and feeding into its algorithms a job candidate’s keystrokes, how many seconds she spends on a page, and whether and when she closes a browser tab, Pegged says, it can learn about how she might perform. It narrows down the applicants to the handful who have the highest chance of succeeding in a given job, according to its algorithms. Hiring managers take it from there.

To test how people might react in a high-stress situation, for example, Pegged will throw a calculus question at someone who might not have a background in math, and then measure his reaction. Does he freeze up? Exit the page? Enter an answer and then change it?
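
Pegged hasn't published how it stores or weighs these signals, but the three buckets and the interaction measurements described above map naturally onto a simple applicant record. The sketch below is purely illustrative; every field name is invented, and none of it reflects Pegged's actual system.

```python
from dataclasses import dataclass, field

# Hypothetical shape of one applicant record, based only on the data
# "buckets" and interaction signals described in this article. All
# field names are invented for illustration.
@dataclass
class ApplicantRecord:
    # Bucket 1: public records (anything you can Google)
    public_records: dict = field(default_factory=dict)
    # Bucket 2: background information (resume, credentials)
    background: dict = field(default_factory=dict)
    # Bucket 3: interaction data captured during the online application
    keystroke_count: int = 0
    seconds_on_page: float = 0.0
    closed_browser_tab: bool = False
    # Reactions to the unexpected "stress" question (e.g. the calculus item)
    stress_answered: bool = False
    stress_changed_answer: bool = False
    stress_exited_page: bool = False
```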

All those metrics taken together create a big, refined data set that Pegged compares with a possible outcome, such as the likelihood an applicant will be a long-term employee. Pegged says its median impact is reducing turnover 38 percent over a 6- to 12-month period and that the worst it’s ever done in that time frame was to reduce turnover 13.5 percent for a specific job at a specific employer. Pegged says it knows what makes a successful employee by amassing a trove of data on people a company has hired or fired in the past and on those who have quit.
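
What the article describes is, at bottom, a supervised-learning problem: fit a model on the people a company has already hired, fired, or lost, with "stayed long term" as the outcome, then score new applicants. The sketch below uses a generic logistic-regression model on random placeholder data; Pegged has not disclosed what models it actually uses.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy version of the general approach described above. The data are
# random placeholders and the model choice is an assumption; Pegged's
# actual methods are not public.
rng = np.random.default_rng(0)
X_past = rng.normal(size=(200, 5))        # features of previous hires
y_past = rng.integers(0, 2, size=200)     # 1 = stayed long term, 0 = left

model = LogisticRegression().fit(X_past, y_past)

X_new = rng.normal(size=(10, 5))          # features of new applicants
scores = model.predict_proba(X_new)[:, 1] # predicted chance of staying

# Surface only the handful with the highest predicted retention,
# and let hiring managers take it from there.
top = np.argsort(scores)[::-1][:3]
print(top, scores[top])
```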

At the 310 facilities that have used its algorithms, said founder and CEO Michael Rosenbaum, the company has never failed to improve retention—although he would share data from only two facilities. One health-care system, Metroplex, saw turnover rate reductions between 15 percent and 60 percent in certain job categories, according to Pegged’s data.  

Pegged, which has 25 employees and says it processes more than 3 million job applications a year, is one of the more seasoned players in the space—though only six years in, it’s impossible to know how many of the hires it’s helped companies make will stay on for decades. It works mostly in the health-care sector, one of the country’s biggest and fastest-growing employers, and one that’s seen a steady rise in hospital staff turnover since 2011.

But the algorithms are versatile, Rosenbaum said. Pegged grew out of Catalyst, which helps hire software engineers for companies such as Nike and Starwood Hotels, and hospitals themselves have a variety of workers beyond health-care professionals.

Regional One, a Memphis health-care system with about 3,000 employees, went to Pegged last year for help hiring in 10 job categories, from various types of nurses to housekeepers. In seven of those 10 job types, the algorithm eliminated attrition over a 60-day period, according to data provided by Pegged. Over a 90-day period it eliminated turnover in five of eight categories. (Regional One had asked that attrition be measured over those time frames; Pegged generally looks at turnover from three months to a year.)
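
Measuring attrition over a fixed window, as Regional One asked for, is straightforward: take everyone hired through the new process and count how many left within 60 or 90 days of their start date. A minimal sketch with made-up dates:

```python
from datetime import date

# Share of a hiring cohort that left within a fixed window of being hired.
# The hire and departure dates below are made up for illustration.
hires = [
    {"hired": date(2015, 1, 5),  "left": date(2015, 2, 20)},  # gone in 46 days
    {"hired": date(2015, 1, 5),  "left": None},               # still employed
    {"hired": date(2015, 1, 12), "left": date(2015, 5, 1)},   # gone after 109 days
]

def attrition_rate(cohort, window_days):
    left_early = sum(
        1 for h in cohort
        if h["left"] is not None and (h["left"] - h["hired"]).days <= window_days
    )
    return left_early / len(cohort)

print(f"60-day attrition: {attrition_rate(hires, 60):.0%}")  # 33%
print(f"90-day attrition: {attrition_rate(hires, 90):.0%}")  # 33%
```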

The algorithms get smarter over time, as more people go through the hiring, firing, and quitting processes. The factors that determine success vary not just with the job type but also with the department and the location. Institutions hiring for exactly the same jobs at different facilities might not need the same kinds of employees. At one health-care group, Pegged found that what predicted success in the job in an acute-care facility—being a leader in a community organization—predicted failure in the same job in a long-term-care facility. It isn’t clear why.

That can be frustrating for clients, who want simple, solid answers on what makes a good candidate.

“In some ways it acts like a black box,” said Robertson, the former Adventist CEO. Algorithms rely on a number of factors to determine the right person for the job. “That’s uncomfortable for people,” he said. “We want to be able to say, ‘They answered this way, therefore they’re a good fit.’ You can’t get those kinds of answers directly out of the approach that Pegged uses.”

Highlighting those complex and often counterintuitive findings is what makes algorithms better at hiring than humans, Rosenbaum said. “There is subconscious bias” in human hiring, he said. 

People tend to hire people with similar tastes, for one thing. Research has also found that people with “black-sounding” names might have a harder time finding jobs, and that looks matter, too. We also carry biases about which skills make someone good at a job. Experience, for example, isn’t always a predictor of success, Pegged has found.

Sceptics think algorithms miss an essential part of hiring. Can a computer really understand human chemistry?

“As humans, we trust our gut so often,” said Anthony Boyce, an associate partner at human resources consultant Aon Hewitt. “And you think of the classic hiring manager: ‘I just like the cut of his jib, and I know he’s going to be a top performer.’”
