AI and Law


Anxious about AI? You’re not alone…

Anxiety is a subject that comes up a lot in therapy. It's an issue for most people on the planet, of course, and a common feature for highly driven people working in fast-paced professions.

But a particular kind of anxiety has come to the fore over the last year or two. Conversations about AI have moved from curiosity to concern.

What does this mean for my role?

The rapid development of AI tools that can draft text, summarise documents, and analyse large amounts of information has understandably triggered anxiety. For a profession built on expertise, careful judgement, and years of training, the fact that technology can automate large parts of that work is unsettling at best.

In the therapy room no worries are off the table. We welcome them all; rational or irrational. And current responses to AI and to the general state of the world are rational, of course. It’s about uncertainty. It feels like an existential threat.

Some say that AI is simply automating a lot of the 'grunt work' that junior and trainee lawyers used to be saddled with, and that these developments will make things better for everyone, freeing lawyers to get on with the 'real' work of lawyering.

And yet, didn’t such work help those new to the profession to learn the craft of law - document reviews, first drafts of research notes and the like? Didn’t all that stuff enable them to acquire ‘process knowledge’? Cory Doctorow, an author who writes a lot on the subject, describes ‘process knowledge’ as ‘the intangible, experiential, and tacit knowledge that workers acquire while producing goods or providing services. It is the "know-how" and muscle memory developed by people, rather than the "know-what" (intellectual property) stored in corporate databases.’ Losing that process knowledge will surely affect people’s self-esteem and confidence. Why? Because, as mind-numbing as the grunt work can be, it lays the foundations for everything that comes later - the more detailed work. It provides the building blocks. And if that is taken away, doesn’t that just leave a rather precarious house of cards (with apologies for mixing metaphors)?

Similar questions arise around AI tools and questionnaires that facilitate first meetings with clients. Some of this stuff is brilliant and saves so much time. But aren’t such tools (at least currently) dependent on the quality of the information entered? Take two different clients, both experiencing dark thoughts. One might say ‘yes, I am suicidal’ in a pre-meeting questionnaire. The other might answer ‘no’ - because they haven’t actually taken steps to plan ending their own life, or because they’re scared that the lawyer might alert the authorities. If the answer has been ‘no’, the lawyer in that first meeting will inevitably touch on that particularly difficult issue less than they might otherwise. And something may be lost in that.

All practitioners are facing situations where clients expect faster outputs at lower cost because technology appears to make that possible. In short, expectations of lawyers are being raised even further.

And clients are emboldened. ChatGPT has made everyone an armchair expert - and it can come up with wildly inaccurate answers. So the authority of lawyers and the value of nuance are being eroded. And the pressure on billing continues to increase.

Before we all run for the hills, it’s worth remembering the waves of technological change we’ve all faced before. Online legal research replaced hours spent in libraries. Electronic disclosure transformed litigation. Case management systems changed the way files were organised and shared. Each shift altered the way lawyers worked, and each prompted concerns about whether certain roles would disappear. Yet in practice, technology has tended to change legal work rather than eliminate the need for lawyers altogether. It often removes some routine tasks while creating space for higher value thinking, judgement, and client interaction.

Artificial intelligence is likely to follow a similar pattern. But the threat feels, and perhaps is, bigger than ever before.

At the moment, AI tools are not as good with context as humans are: risk assessments, relationships, an understanding of the human implications of a decision. Those elements remain within the domain of skilled professionals. At the moment.

This is a challenging time for those who are prone to anxiety anyway. Many of those working in the legal profession form strong attachments. As wonderful as our lawyer brains are, they get attached to our ways of doing things, to the status quo, because it provides security. Without our familiar structures, we get a bit wibbly.

So these changes do present an existential threat. But they are also an opportunity to strip back and examine our own attachment styles. To have a look at what we’re clinging to. And to understand that, once you strip away all the accoutrements, the traditional structures, perhaps even the time recording systems (!), we do have something fundamental to offer as humans.

Perhaps this is a chance for organisations to hone their learning and development, focusing on their people’s humanity, their judgement and their practical skills. Mentoring, collaborative problem solving, and exposure to strategic thinking may become even more important.

Cory Doctorow talks about ‘centaurs’ and ‘reverse centaurs’ in the context of AI. A ‘centaur’ is a person assisted by a machine, while a ‘reverse centaur’ is what Doctorow evocatively describes as ‘a machine head on a human body, a person who is serving as a squishy meat appendage for an uncaring machine’.

By focusing on how we can be centaurs in this context, we retain a modicum of control. Understanding what AI tools can and cannot do helps to replace vague fears with practical knowledge. Keep technology as a support rather than a competitor.

The legal profession has always been rooted in careful thinking, ethical responsibility, and trust. Those qualities remain essential regardless of how sophisticated technology becomes. While AI is changing aspects of legal work, I hope it does not remove the need for thoughtful professionals who can interpret the law, guide clients through difficult decisions, and apply judgement in situations that rarely have simple answers.

Law is a people business. Let’s keep it that way.

By Annmarie Carvalho