How AI Content Generators Are Changing the Legal Field
Category: Machine Learning | Sunday, December 3, 2023, 01:58 UTC
AI content generators like ChatGPT are increasingly being used in the legal field to interpret the law, provide access to justice, write contracts and legal documents, offer legal aid, support decision-making, and facilitate lawyer-client communications. These applications still need to be controlled responsibly, as AI can produce fake content, and in legal matters lawyers and judges will still be necessary.
AI content generators like ChatGPT will never replace lawyers and judges, but they're increasingly being used in the legal field. There's already a lot of excitement about what these tools can do in terms of interpreting the law, providing access to justice and education about legal matters, writing contracts and legal documents, offering legal aid, supporting decision-making, and facilitating lawyer-client communications.
But the arrival of ChatGPT and the applications that work with it has raised two kinds of questions about how to control and use it, according to professors Nicolas Vermeys and Karim Benyekhlef of Université de Montréal's Faculty of Law.
Dependent on jurisdiction and territory
The first issue, according to Vermeys, is that unlike medicine or other scientific fields, the law is a discipline that applies to a specific region or territory.
"For example, Canadian criminal law only applies within Canada, just like Quebec's civil law only applies within Quebec," he said.
So even though AI content generators can be trained, there's a real risk of getting the wrong legal information from the tool, "especially since Quebec is under-represented in the legal datasets used to train these algorithms," he added.
"The very design of tools like ChatGPT has created another issue," said Vermeys, who also leads UdeM's Research Center in Public Law.
"ChatGPT was designed to provide the most likely answer to a question, not the best answer. That means its results could be fake. When I asked it to cite my five most important published studies, it referred me to four that didn't exist and a fifth that wasn't even mine." .
The arrival of AI content generators raises the question of how to control the content the software provides.
"Among other things, who's responsible if ChatGPT uses content protected by copyright?" asked Vermeys. "And who's responsible if these tools provide an answer that includes someone's personal information?" .
Enhanced rather than artificial intelligence
Professor Karim Benyekhlef agrees with his colleague about the risks posed by generative AI in the field of law, but he nonetheless believes the software can be used judiciously.
"This type of tool can improve access to justice by giving people information about their rights—like the Justicebot tool that we developed at the UdeM Cyberjustice Laboratory," said Benyekhlef, the lab's director.
The lab also developed PARLe, the first online conflict resolution platform, which has directly resolved 65% of the disputes submitted to it.
"When PARLe isn't able to resolve a dispute on its own, a mediator steps in and is given access to a decision-making tool that flags derogatory statements made by the parties and suggests more polite alternatives and responses," said Benyekhlef . "That's why I think it's more appropriate to talk about enhanced intelligence rather than artificial intelligence." .
But Benyekhlef also points out that these tools and bots are useful for consumer, labor or neighborly disputes—in essence, common, low-stakes issues.
Legal experts are here to stay
For complicated cases, lawyers and judges still need to be involved.
"Given thar AI content generators are getting better all the time, those working in the field need to make sure that the information they provide is accurate, and that mistakes are avoided," said Benyekhlef .