Singapore lawyer Kui Bao (Johnathan) Lee shares practical advice for employers adopting AI in recruitment
Artificial intelligence is steadily entering the hiring process, promising greater speed and efficiency in recruitment. However, as more companies explore its use, concerns around legal compliance and ethical accountability are becoming more pronounced.
In a report by CNA, analysts said that companies have a duty to ensure employees are not unnecessarily displaced by AI. This aligns with findings from Randstad’s latest Workmonitor survey, where only 54% of employees in Singapore said their employers had helped them future-proof their skills.
This disconnect reflects a broader challenge for employers: adopting AI tools to improve hiring outcomes while upholding fairness, transparency and regulatory compliance. Kui Bao (Johnathan) Lee, advocate and solicitor at Fong & Fong LLC, has advised companies on the responsible use of AI in recruitment.
“I went to ask some of my clients, ‘Hey, is your company using AI for hiring?’ Most of them say no, except for one who uses it for scheduling, so auxiliary matters,” said Lee.
“They’re using it as a complement, but not to replace, which is a good sign.”
Lee acknowledged the potential of AI for improving efficiency and legal compliance in areas like job description writing and policy review. “There are huge opportunities in terms of efficiency and productivity,” he said.
“From your policies to your job descriptions to how you interact with possible candidates, adopting a consistent voice… I think that's also part of AI's capability, provided that you give the right prompts.”
Lee shared that his clients, mostly small and medium-sized enterprises, remain cautious about using AI to assess candidates.
“They still want to look through all the resumes, eyeball them. It's a matter of trust. They trust themselves more than AI,” he said.
“It also has to do with education. Even though [AI] has been around for some time, not a large population of people are trained in using it properly,” he added, a point highlighted by Randstad’s latest survey.
He added that AI tools are not yet capable of assessing certain human traits that matter in hiring. “Qualifications can mean smarts, ‘book smarts,’ but [for] front-facing roles, there's an EQ there that AI can’t discern because it's still robotic. It can mimic human language, but it's still not yet that empathetic.”
“AI is here to stay,” he added. “So, it would be prudent for employers to adopt AI, learn quickly, fail quickly too… but learn from the mistakes, and have AI do the menial tasks. For the policy, the thinking, the strategic tasks, I think so far, only the human can decide.”
AI’s limitations present challenges under Singapore’s existing legal framework, especially when it comes to workplace fairness.
“Singaporeans are pragmatic. We also pride ourselves on the rule of law, and we want to comply, and we know the importance of complying,” said Lee. “AI presents a unique problem… because the AI models at this point in time [are] mostly trained on Western models.”
“So naturally, you would suspect biases towards the Western culture of doing things,” he added, cautioning HR teams to be mindful of unintended discrimination when using off-the-shelf tools.
Lee emphasised that the legal consequences of misusing AI remain with the organization.
“Any breach will be pinned on the human user,” he said. “At the end of the day, the law places the human accountable, not the AI.”
He urged employers to avoid treating recruitment as a purely technical function. “If the company appears to be treating hiring in a very robotic way, then what does that mean for an employee right now?”
He also raised the risk of scope creep if AI is extended beyond recruitment. “If AI is used for recruitment, HR professionals will use it for work performance assessment, for promotion, for firing, for retrenchment.”
Lee noted that while the Personal Data Protection Act (PDPA) currently does not directly address AI, employers must still apply its principles when using such tools.
“You've got to make sure that your data, how you use the data, is clearly defined,” he said. “Even if it's internal AI systems, it will be good practice to tell the job applicant, saying, ‘Look, we are going to use AI systems. Your data is going to be given to AI to assess.’”
He also cautioned against the silent use of applicant data for AI training. “I don't think it will be right if you don't tell the applicant that your data may be used to train AI as well.”
Meanwhile, employers using third-party providers should treat them as data intermediaries and ensure their contracts reflect this. “If you are going to transfer that data, then you have to ensure that that third party complies with existing laws. What type of contract do you have for them? How do you ensure that they will safeguard the data as part of compliance?”
“There’s so much personal data in a simple resume,” Lee said. “You've got phone numbers, you've got your pictures, names, addresses… And data is the new gold.”
Lee stressed the need for thorough documentation throughout the AI-assisted hiring process.
“Always leave a paper trail because, at the end of the day, AI is a tool. And it is what you put in the tool and how you use the tool that is absolutely important,” he said. “Whatever it is, how you use the tool… it must stand up to scrutiny.”
Human review remains critical even after AI shortlists candidates, as reported by CNA. “The person hiring, the HR director, the people in charge of recruitment, must then also not just rely on the score… and exercise all due diligence.”
Lee said employers should also account for those excluded by AI screening. “There also must be a person to eyeball those people who have not been selected through AI.”
He added that strong systems upfront can mitigate risks and reduce future exposure. “You’ve got to front-load your effort. You’ve got to front-load your time, your money,” he said. “But once all these are set up, there will be huge cost savings… preventing future loss.”
Lee’s final advice is simple: don’t wait to involve legal counsel.
“Companies will be wise to take on board legal counsel and really scrutinise its processes and development,” he said.
“Engage the legal counsel as it's developing and preparing to use AI and ensure that all the processes at every step of the way will be compliant.”