For the legal profession itself, AI represents both a threat and an opportunity. It could lead to a “savage reduction” in jobs for humans, according to a 2021 report from the UK’s Law Society. And a study this year from researchers at the University of Pennsylvania, New York University and Princeton estimated that the legal sector was the industry most likely to be affected by AI.
At the same time, AI can play a hugely valuable role in researching and putting cases together, although there is precedent for things going horribly wrong.
New York lawyer Steven Schwartz found himself facing his own court hearing this year after he used the popular AI chatbot ChatGPT to research precedents for a case involving a man suing an airline over a personal injury. Six of the seven cases he cited had been completely made up by the AI.
While that may have left many law firms reluctant to embrace such systems, Ben Allgrove, chief innovation officer at international law firm Baker McKenzie, has a different view. Mr Allgrove thinks that the vast majority of AI use at his firm will come through new AI-powered versions of its existing legal software, such as LexisNexis and Microsoft’s 365 Solution for Legal.
LexisNexis launched its AI platform, which can answer legal questions, generate documents and summarise legal issues, back in May. The caveat is that the premium, paid-for versions of such tools are currently expensive. The alternative is for law firms to pay less to access general-purpose AI systems that are not specifically aimed at the legal market, such as Google’s Bard, Meta’s Llama and OpenAI’s ChatGPT. The firms would plug into such platforms and adapt them for their own legal use.
Baker McKenzie is already testing several such systems. “We are going out to the market and saying we want to test the performance of these models,” says Mr Allgrove. That testing is crucial, because all of the current systems will make errors.
Alex Monaco is an employment lawyer who runs both his own solicitors’ practice and a tech firm called Grapple. Grapple was developed to give members of the public what Mr Monaco calls “an ontology of employment law”, offering advice on a range of workplace issues, from bullying and harassment to redundancy. It can also generate legal letters and provide summaries of cases. Mr Monaco is excited about the potential for AI to democratise the legal profession. “Probably 95% of the inquiries that we get are from people who just cannot afford lawyers,” he says.
But thanks to widely available free AI tools, people can now build their own legal cases. Anyone with an internet connection can use Bard or ChatGPT to help formulate a legal letter. And while the result might not be as good as a letter written by a lawyer, it is free.
It seems to me that, since the law is generally built on a foundation of precedent, using AI as a search tool to sift through millions of documents is a good idea. But if the AI being used (or someone else’s AI) “invents” its own precedents, then the law will be on shaky ground indeed. Will there be an AI to check the legal AI?