Speakers say AI tools hold promise in tackling legal questions, if challenges can be overcome
There is hope that artificial intelligence applications such as ChatGPT can help level the playing field for access to justice. Still, panellists at a recent webinar on its potential agreed that many challenges and pitfalls must be overcome before AI can replace human legal advice.
“There is a massive access to justice gap in Canada,” said Ryan Fritsch, counsel with the Law Commission of Ontario, which has been looking at the implications of using ChatGPT as a legal tool.
At the webinar, organized by the Law Commission of Ontario (LCO) in cooperation with the University of Ottawa Centre for Law, Technology and Society, Fritsch noted that there is a “massive demand” for AI legal tools to help people access justice. “We always thought that AI wouldn’t replace humans until AI was as good as humans,” Fritsch said. “ChatGPT is clearly not as good as humans yet, but the demand for it is not being driven by the technology. It is being driven by the economics, and the barriers that people have accessing justice.”
He noted that free AI tools such as ChatGPT can at least provide a starting point for those seeking legal advice. It’s good at “identifying and pointing you in the general direction,” he said, which “has a lot of value for a lot of people,” especially those who don’t know their legal rights.
Indeed, Law Times recently spoke with family lawyer Russell Alexander of Russell Alexander Collaborative Family Lawyers about the potential of AI to make lawyers more efficient and enhance access to justice for clients.
ChatGPT has only been around as a public AI tool for a few months, Fritsch said, but it already has more than 100 million users and is the fastest-growing internet app ever. It can generate a text response to almost any typed prompt, from children’s stories to graduate essays to family eulogies, and it can answer questions of a legal nature.
For instance, ChatGPT could tell a layperson if they have a legal case as part of a quick legal self-assessment. It could also draft court materials and even quote case law and legislation. This could help people without lawyers generate legal documents like wills, divorce settlements, custody agreements or employment contracts.
“So GPT has a lot of promise for access to justice,” Fritsch said, but the question remains: “Is it practical yet?”
How ChatGPT works
ChatGPT uses a neural network called a large language model to correlate words into phrases and phrases into paragraphs, creating almost instantaneous responses to whatever it is asked. The model was trained on some 300 billion words scraped from the internet and encodes what it learned across 175 billion parameters, much of that material touching on legal questions. Professors at the University of Minnesota law school had ChatGPT take exams, and it earned a C+ average on essays and exam questions.
Said Fritsch: “Not bad. But is it good enough?”
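To make “correlate words into phrases” concrete, here is a minimal, illustrative Python sketch of next-word prediction, the basic mechanic underlying large language models. It is a toy bigram counter over one invented sentence, nothing like ChatGPT’s actual architecture, which learns statistical patterns across billions of parameters rather than raw word counts.

```python
import random
from collections import defaultdict

# Toy illustration of next-word prediction. Large language models work on
# the same principle -- predict the next token from what came before -- but
# at vastly greater scale and sophistication.
corpus = ("the tenant may terminate the lease if the landlord fails "
          "to repair the premises within a reasonable time").split()

# Record which words tend to follow which (a simple bigram model).
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

# Generate text by repeatedly sampling a plausible next word.
word, output = "the", ["the"]
for _ in range(10):
    if word not in follows:
        break  # no observed continuation; stop generating
    word = random.choice(follows[word])
    output.append(word)
print(" ".join(output))
```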
He’s also worried that ChatGPT might reflect social biases that could hurt “particularly vulnerable groups relying on these tools” depending on the advice given.
For example, if a victim of domestic assault asked ChatGPT what to do, the app might give generic information about going to the police and making a complaint. However, information on how to get to a shelter or get out of a violent relationship may be what the victim needs.
“Those are the kinds of gaps that we see between the kinds of generic answers that you get from something like ChatGPT and what you get from actually going to a lawyer and actually trying to get good legal advice.”
Several employers are clamping down on the use of ChatGPT among employees, including Amazon, Verizon, Citigroup, Goldman Sachs, Wells Fargo and Accenture, according to AIBusiness.com and Tech.co.
The potential of ChatGPT and AI for legal use
Speaker Daniel Linna, a senior lecturer and the director of law and technology initiatives at Northwestern Pritzker School of Law and the McCormick School of Engineering, said one solution for effectively using ChatGPT, or any AI tool, in a legal context is to “give ChatGPT and large language models the facts.”
If ChatGPT, for instance, is given a contract to analyze, it could determine whether the contract meets stated requirements and point to related information that could help with document revision. It could also help summarize and explain court judgments, draft patent applications, analyze legal briefs and improve writing. However, public AI systems like ChatGPT are not private, and confidential information should never be entered into the app.
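As an illustration of Linna’s “give the model the facts” approach, the sketch below passes a contract’s text directly into the prompt so the model analyzes the actual document rather than guessing. It uses OpenAI’s Python SDK; the model name, file name, and prompt wording are illustrative assumptions, and, per the caution above, only non-confidential material should ever be sent to a public service.

```python
# Sketch of the "give the model the facts" pattern: supply the document
# in the prompt so answers are grounded in its actual text.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Illustrative file name; use only a non-confidential sample document.
contract_text = open("sample_contract.txt").read()

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; any capable chat model
    messages=[
        {"role": "system",
         "content": "You are assisting with contract review. Answer only "
                    "from the contract text provided."},
        {"role": "user",
         "content": "Does this contract include a termination-for-"
                    "convenience clause? Quote the relevant section.\n\n"
                    + contract_text},
    ],
)
print(response.choices[0].message.content)
```

Instructing the model to answer only from the supplied text is a common design choice for keeping responses anchored to the document rather than to confidently worded guesses.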
Linna said ChatGPT is also “incredibly good” at drafting press releases and other communications where “you don’t have specific ground truth fact that you need to have incorporated into what you’re trying to produce.”
“The potential is that it can transform the way we practise law,” he said, but “if you’re asking whether it will replace lawyers, you’re asking the wrong question.” The real question is how to harness AI so that lawyers become more efficient and those representing themselves do the best legal work possible.
Bespoke tools vs. free and public chatbots
Amy Salyzyn, associate professor at the faculty of law at the University of Ottawa, said it is essential to recognize that many of the law firms using AI as a legal tool aren’t just plugging in ChatGPT – “they are using bespoke tools built with certain type of guardrails” and firewalls to protect client confidentiality.
“I think we’re at the beginning of seeing where this will go,” Salyzyn said, and there are still question marks about how revolutionary this use of AI will be. “But this isn’t about one simple tool. It’s a technology that’s become more powerful, and allows for natural language processing and extracting information,” she said. “We’re at the beginning of the story, and I think it’s going to be exciting to see where it goes.”
Speakers at the webinar also discussed ethical and regulatory issues raised by ChatGPT. Salyzyn said lawyers should be aware of their obligation to provide competent advice.
“A lawyer simply opening up ChatGPT on their computer and using that to funnel legal advice directly to a client is going to get into a lot of trouble,” she said. While ChatGPT getting a C+ on a law school exam is not bad, “having 20 percent of the legal answer wrong is pretty significant.”
Salyzyn also pointed out that the practice of law has evolved to a point where computerized legal research is no longer a choice but part of an efficient and fair legal system. “If there is a tool out there that has sufficient reliability that can do things much faster. . . is using that tool part of your obligation to provide efficient legal services?”
She also suggested that there could be a role for law societies to regulate such AI tools, especially for those that are aimed directly at the consumer. “Law societies are starting to look at how they can engage with these tools . . . because there is a huge public protection issue if these tools can give very confidently sounding answers that are completely inaccurate.”
Fritsch also expressed concern that a two-tier system of AI legal tools could develop: those available free online, and others “more tailored to addressing legal issues and providing more coherent and reliable answers.”
That could create a situation “where folks who can afford those tools show up to court, even as self-represented litigants, with better information and a stronger and possibly more compelling argument.
“There’s an access to justice issue here [in how] we make these tools available to all litigants,” he said, noting that closed legal database systems, like Westlaw, are having to open themselves up to self-represented litigants so that they have equal access to quality information when they show up in court.