
How generative AI will impact the way lawyers work

21 Jul 2023

Author: Diana Clement

Is your graduate a cheat? As law firms begin experimenting with ChatGPT and other forms of artificial intelligence, some worry that their graduates might not have the skills their academic transcripts suggest.

ChatGPT can, in seconds, generate essays and exam answers that would have taken students hours or even days in the past.

It isn’t like a Google search, says Antonia Modkova, director of IP and innovation at Soul Machines. “What it is generating is something novel that the world has not seen before.”

The issue for both law schools and employers is that with the right prompts, ChatGPT can generate answers to complex questions. It is already good enough to pass law exams, Modkova says. GPT-4, the latest iteration of the software, scored 75% in a bar exam. “Good enough to place in the 90th percentile,” she says.

For law firms wanting to hire graduates in the next few years, that could mean these students’ abilities might bear no resemblance to their official academic transcript if they’re skilled at using generative AI tools. Conversely, the graduates who can utilise the technology the best could prove more profitable in private practice.

There are many uses within law firms for the technology, Modkova says. These include proofreading, legal research, drafting documents and contracts, reviewing documents and contracts, giving legal advice, automating client correspondence, summarising client meetings and even judicial decision-making.

For example, when Lawgeex used an AI robot for reviewing contracts and tested it against 20 lawyers, the robot could pick up 94% of the risks in non-disclosure agreements (NDAs). The human lawyers picked up only 85% of those risks.

Modkova adds that generative AI doesn’t get bored, it writes faster than a human and is more scalable than a single legal expert charging by the billable hour.

On the downside, generative AI lacks common sense and understanding, it’s extremely prone to groupthink and bias, contradicts itself and can be tricked, among other failings, she says.

Academic angst
Make no mistake, graduates will need the ability to use generative AI tools. Simply knowing the law is no longer enough, says Alex Sims, associate professor in the Department of Commercial Law at the University of Auckland Business School.

If they can’t use the tools, graduates will soon be too slow in their work. The technology will only get better. But in academia, AI is generating a great deal of angst. Anti-plagiarism tools such as Turnitin are scrambling to add ChatGPT detection capability. But can these tools stay ahead of the rapidly evolving technology?

Academia doesn’t yet have the answers. It’s a trying time for law schools and other departments in universities everywhere, says Sims. “It means [universities] can’t keep on doing what [they] did in the past.” New forms of assessment are needed.

Assessing students
Law schools could focus on trying to detect AI use in written material. Or academia could try to adapt to a new paradigm, which may not be a bad thing, Sims says.

She argues that tests and exams have never been good assessment vehicles, anyway. “Just because some people are no good at exams, it doesn’t mean they don’t know the law.”

As an example, Sims cites herself. She had no experience of sitting formal tests and exams when she entered university.

“When I was at law school, I wasn’t very good at exams because I went to a different type of school,” she says. “I would help [other students]. I’d explain what the law was to them because they didn’t understand it. And they ended up getting better marks than me in the exam because they were good at exams.”

The irony in academia, Sims says, is that universities are now considering assessment methods that have been superseded or ruled out in the past.

“One way that you can assess students to see whether they understand the law is in-person, handwritten assessments.” That assumes good writing will still be needed in the law, as it was in the past, she says. And because generative AI tools can now do that writing for students, in-person handwritten tests may be the only way to be sure the work is their own.

Verbal assessments could also be used to ensure students know the material. But, like handwritten assessments, these can be cumbersome and involve a lot more work.

Many academics are now talking about assessing soft skills, which Sims says is another irony, albeit a positive one.

“Those so-called soft skills like presenting and talking with people were dismissed [in the past].” Academics argued that law students didn’t really need the soft skills. “Now people are going ‘how can we assess students’ soft skills?’”

Why hire juniors?
It has been suggested that law firms shouldn’t worry that their graduates might have cheated in their exams because generative AI ought to be able to replace them. Why bother with self-obsessed young employees if a tool can do the job in less time?

Stace Hammond partner Kesia Denhardt disagrees. “Presently, this new technology does not have the competence to compete with lawyers so as to render them obsolete,” says Denhardt.

In fact, Denhardt argues that generative AI has significant potential to impact the way lawyers work in a positive way, by eliminating or reducing the time and effort spent on some legal work, including drafting, research and other more routine tasks which can be automated and delegated to AI.

“However, there are key human attributes embedded in legal work which AI does not possess, including and especially critical thinking and the exercise of judgment, and the abilities to develop strategy and take account of ethical considerations.

“AI does not have the instinct, perception or sensitivity innate in lawyers, nor does it understand or have the ability to create a trusted relationship with a client. It also does not have the practical experience a lawyer has in dealing with the unpredictable.”

Collaboration
Denhardt argues that the optimum relationship between lawyers and generative AI would be collaboration.

“AI cannot and should not substitute for lawyers but could be utilised to work with lawyers as a tool to complement their skills and perform certain legal tasks more efficiently, thereby freeing them up to focus on the more human aspects of legal practice.

“By joining forces, lawyers and AI could each offer one another the capability to achieve what they cannot do alone.”

Denhardt says the flow-on effects of AI being used by the profession may lead to some law practitioners changing their business model from billing on a time-spent basis to value-billing or other alternatives. It may also result in better access to justice.

Goodbye to grunt work
Some industry thinking around law and generative AI has been at either end of the spectrum, says Sims. “On the one hand, [it’s] that grads and even lawyers are no longer needed. They are. But not as many will be needed. On the other hand, it’s that AI will have no effect on grads and lawyers because law is ‘special’.” Neither is correct.

Having a good grasp of how to get the best out of generative AI will be beneficial for graduates. Even now, they shouldn’t be doing grunt work, Sims says. “They should never be doing discovery because that’s been done by AI for a long time.

“[When] doing background research for the partner they’re working for, they need to be using the AI tools and other tools to do their work. They’ll be able to do more and, hopefully, more interesting things than they currently are.”

What’s more, the ability of technology to change legal work is not new, she adds. “About 10 years ago, I was at a law and tech conference in Sydney. At my table was a lawyer in his 60s. He said he had changed his practice. He now had seven younger people working for him and was getting through more work than when he had 21 lawyers.” The only difference is that the pace of change is becoming far more rapid.

Law firms have a couple of years up their sleeves before AI ‘cheats’ start applying for jobs. Students finishing this year or next have probably done most of their work and assessments without the help of ChatGPT, Sims says.

Going forward, the answer to worries about so-called cheating could be quite simple. Law firms could reduce their reliance on official academic transcripts and focus more on students proving themselves as interns or summer clerks, she says. It’s a try-before-you-buy approach to employment.

Another alternative might be to use interviews to assess the graduate’s legal knowledge.

In fact, it might be those doing the interviewing who are destined to be put out to pasture, if they can’t keep up with the technology.

“I do know that there are senior lawyers who are very much looking forward to retiring or retiring slightly early because they can see what’s coming,” Sims says.

On the other hand, while partners may be wringing their hands, the technology will prove useful – at least for those with the skills to harness it and ensure that results are reliable.

For example, Andrew Perlman, dean of Suffolk University Law School, wrote a 14-page legal article in one hour by collaborating with ChatGPT. Summarising the experience, he said: “For the legal industry, ChatGPT may portend an even more momentous shift than the advent of the internet.”

In the right hands, generative AI can be more reliable than lawyers and is highlighting that humans make just as many mistakes as machines, if not more, Sims says. In the future, she believes it will be negligent not to use AI.

Humans make mistakes
As it stands, ChatGPT has a growing reputation for making up references. In the wrong hands, it’s more of a toy than a tool. Relying on it in the law is currently risky.

“The results from the generative AI tools are simply not good enough for the results to simply be relied upon,” Sims says. “You need someone with knowledge/expertise to know what is correct and what requires changing, especially with New Zealand law.

“Knowing what prompts to use works only if you have the ability to judge what is and isn’t correct. This is where knowledge comes in. The ChatGPT comes up with a case. It may be an actual case. But it may have got that case wrong.”

She cites one example she saw in academia where a student entered a question into AI and received an answer containing inaccuracies. Knowing this, the student refined the question two more times until the answer was accurate. The skill was in knowing how to use the tool to fast-track the work. The more desirable graduate will be one who can use the technology in this way.

Lawyers shouldn’t jump into using generative AI without considering the risks. Sims warns lawyers to be very careful about entering client data into public tools such as ChatGPT. “There are all sorts of confidentiality issues there. The reason for that is that [the client data] then becomes part of the data that ChatGPT is using.

“That’s actually a policy that law firms, if they haven’t done already, need to implement immediately.”

The technology will be integrated into legal solutions, intranets or other closed systems that keep the information secure, she says.
