While it may be a time-saver for some tasks, it's unlikely to replace a human being, says academic
According to a recent report, adoption of generative artificial intelligence (GenAI) in Australia could unlock tens of billions of dollars in economic value by 2030.
The claims - in Australia’s Generative AI Opportunity from the Tech Council of Australia - are based on accelerating responsible adoption of the technology. The report says that GenAI could automate 44% of Australian workers’ tasks and contribute as much as $115bn a year to Australia’s economy by 2030 by improving existing industries and enabling the creation of new products and services.
But where does the HR role fit into this – how much is there to be gained in such a people-orientated role?
“From an HR perspective, you still need that people focus, and any AI system will need human oversight,” says Martin Edwards, associate professor at University of Queensland Business School.
“Increasingly, various legislation is suggesting any AI system that produces, analyses and works with employee data will need oversight to make sure that, for example, output or recommendations it produces aren’t biased.”
While algorithmic AI systems have been growing in use in HR for tasks such as processing CVs to match against job specifications, generative AI - which produces text-based output - has quite a different scope in the role, says Edwards. For this reason, it lends itself more to admin-based work that produces text-related output.
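To make the distinction concrete, the kind of algorithmic matching Edwards refers to can be sketched as little more than scoring CV text against a job specification. The example below uses simple TF-IDF similarity with invented candidate text; it is illustrative only and does not describe any particular screening product.

```python
# Illustrative sketch only: a simple keyword-similarity matcher of the kind
# contrasted with generative AI above. Real screening tools are far more
# sophisticated; this just scores CV text against a job specification.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job_spec = ("IT systems support: troubleshoot Windows desktops, manage "
            "Active Directory, respond to helpdesk tickets.")
cvs = {
    "candidate_a": "Five years of helpdesk experience, Windows troubleshooting "
                   "and Active Directory administration.",
    "candidate_b": "Marketing graduate with social media campaign experience.",
}

vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform([job_spec] + list(cvs.values()))

# Cosine similarity of each CV against the job spec (row 0).
scores = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
for name, score in zip(cvs, scores):
    print(f"{name}: {score:.2f}")
```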
It could also be useful in helping to produce reports, he says, but HR professionals should exercise caution.
“There's been quite a discussion about people using these in place of actually writing their own material,” says Edwards.
Problems with accuracy
There have also been problems with the accuracy of information included if, for instance, a GenAI system trawls the internet to find information for a report, he says.
“There are problems apparently with GenAI actually hallucinating and making up facts, and also if referencing research, then sometimes they’re just making it up. So one wouldn't be able to assume that the facts and the references it’s linking to were actually accurate. Somebody would have to check that.”
There can certainly be time-saving benefits to using it though, says Edwards.
“For example, if you’re in an HR analytics team, you could put your data into one of these systems and ask it to produce a report and a set of graphs and visualisations. You could request that it describes the data in some analysis in a particular way. That could then help feed into a report that somebody might be trying to build. You might also perhaps ask it to write you something like a communication email to staff welcoming a new member of the executive team.”
GenAI could be used in preparation for presentations too, so there’s plenty of potential for time-savings in HR activities, he says.
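The workflow Edwards describes - summarising HR data and then asking a generative model to draft narrative text from it - might look something like the sketch below. The OpenAI client, the gpt-4o-mini model and the attrition figures are assumptions chosen purely for illustration; any comparable GenAI service would do, and the output would still need human review.

```python
# A minimal sketch of the workflow described above: summarise HR data with
# pandas, then ask a generative model to draft narrative text from it.
# Assumptions: the OpenAI Python client and "gpt-4o-mini" are used purely as
# an example provider/model, and an API key is configured in the environment.
import pandas as pd
from openai import OpenAI

attrition = pd.DataFrame({
    "department": ["Sales", "Engineering", "Support"],
    "headcount": [120, 85, 40],
    "leavers_last_quarter": [9, 3, 6],
})
attrition["attrition_rate"] = attrition["leavers_last_quarter"] / attrition["headcount"]

prompt = (
    "You are assisting an HR analytics team. Using the quarterly attrition "
    "figures below, draft a short report section describing the headline "
    "trends in plain language.\n\n" + attrition.to_string(index=False)
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)

draft = response.choices[0].message.content
print(draft)  # a human reviewer should still check and edit the output
```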
Potential for bias with GenAI
However, human oversight is also essential because algorithms are prone to different types of bias, says Edwards.
Even when generating job descriptions, there is potential for bias, he says.
“If you ask a generative AI system to draft a job description for an IT systems support role, for instance, that generative AI system will no doubt draw from data from the internet. There's a potential for bias in what it produces if there's a bias in the job descriptions in the information it’s drawing from.”
Where it can be a great tool, he says, is when you are starting from a blank sheet of paper, because it can trawl through massive amounts of data from the internet to produce a first stab at the task.
“It will still require a human being at the moment to be interacting with that to make sure, for example, the text it produces is both applicable and appropriate for the organisation, but also to check whether or not there's any kind of bias associated or inaccuracies in what’s being produced.
“Gender-neutral job descriptions is something you'd want to be checking for, or making sure the description of the roles is inclusive. A generative AI system may not be sensitive to those things.”
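A check of that kind could start as something as simple as flagging gender-coded wording in a drafted description before it is published. The word lists in the sketch below are a small illustrative sample rather than a validated lexicon, and such a check supplements human review rather than replacing it.

```python
# Illustrative sketch: flag gender-coded wording in a GenAI-drafted job
# description before it is published. The word lists are a small example
# sample, not a validated lexicon.
import re

MASCULINE_CODED = {"competitive", "dominant", "rockstar", "ninja", "aggressive"}
FEMININE_CODED = {"supportive", "nurturing", "collaborative", "empathetic"}

def flag_gender_coded_terms(text: str) -> dict:
    words = re.findall(r"[a-z]+", text.lower())
    return {
        "masculine_coded": sorted(set(words) & MASCULINE_CODED),
        "feminine_coded": sorted(set(words) & FEMININE_CODED),
    }

draft = "We need a competitive, aggressive rockstar to dominate our support queue."
print(flag_gender_coded_terms(draft))
# {'masculine_coded': ['aggressive', 'competitive', 'rockstar'], 'feminine_coded': []}
```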
So while it may be a time-saver at present, it's not necessarily going to replace a human being in that context, he says.
“Going forward we don’t know where the technology will lead,” says Edwards. “Potentially, and this is somewhat speculative, you could even feed data associated with performance evaluations into these systems and ask them to produce feedback narratives on the basis of the data that you put in. There are various things it's possible to start using these things for but we're still some way away from actually doing that.”