
A partner at a mid-size litigation firm told me something that stuck. She said she used to bill four hours a week doing nothing but reading case law she already knew — just to confirm it was still good law, that nothing had been overruled, that she wasn't about to walk into court citing something that had been quietly gutted by an appellate ruling two years ago. Four hours a week. Every week. That's now twenty minutes with Lexis+ AI.

That's not a revolution. It's not AI replacing lawyers. It's AI giving a very good lawyer four hours of her week back. And that, more than any of the breathless vendor demos, is what the adoption of AI in legal actually looks like right now.

This piece covers the tools that are genuinely making their way into law firms and legal departments — not the ones with the best marketing decks, but the ones that partners are quietly renewing subscriptions for and associates are using late on a Tuesday to get ahead of a filing deadline.

First, a Reality Check on Where Legal AI Actually Stands

The legal industry is not slow to adopt AI. It is slow to adopt AI carelessly. And that's not the same thing.

Lawyers operate in a world where a confident wrong answer isn't just an inconvenience — it's malpractice. The entire profession is built around the idea that you don't say something you can't source. So when AI tools started hallucinating citations to cases that don't exist — and several lawyers discovered this the hard way, including one who submitted fabricated case citations to a federal judge in 2023 and faced sanctions — the profession didn't panic. It got careful.

The result is a nuanced landscape where adoption is high, but concentrated. Lawyers are using AI aggressively in specific, well-defined tasks where the risk of a confident wrong answer is either low or easily caught. They're avoiding it in contexts where a single unverified sentence in a brief could blow up a case.

Lawyers aren't resisting AI. They're using it in the places where a mistake doesn't end a client's case. That's not caution — that's professional judgment.

With that framing, let's look at what's actually in use.

Legal Research: Where AI Has Already Won

This is the most mature and widely adopted category, and it's not particularly close. Legal research was always labor-intensive, expensive, and deeply unpleasant for junior associates who spent weekends reading case after case looking for a single relevant paragraph. AI has hit this use case hard and well.

Lexis+ AI (LexisNexis)

LexisNexis has been in legal research longer than most of its users have been alive, so it had the database advantage from day one. Lexis+ AI layers a conversational interface on top of its existing corpus — which means you can ask it "find me cases in the Ninth Circuit from the last five years where courts have allowed piercing the corporate veil on an alter ego theory" and get a sourced, cited response rather than a list of results you still have to read.

The key feature lawyers actually care about: every answer cites real cases, and those citations link directly to the verified document in the Lexis database. That closed loop — AI answer plus live citation check — is what separates a tool lawyers will actually trust from one that sounds impressive in a pitch.

The criticism you hear most: it's expensive, and if you're already a Westlaw shop, the switching cost is real. But firms that have made the transition tend to stay.

Westlaw Precision + CoCounsel (Thomson Reuters)

Thomson Reuters acquired Casetext — and with it, CoCounsel — in 2023, and since then they've been integrating CoCounsel's AI capabilities directly into Westlaw. The result in 2026 is a deeply capable research assistant that can draft memos, summarize depositions, review contracts, and answer research questions with cited sources.

What CoCounsel does differently from a generic LLM: it only answers from the legal database it's connected to. If it doesn't find support for something, it says so. That's a deliberate design choice that looks like a limitation but is actually the product's main feature. Lawyers don't need an AI that's always willing to give an answer. They need one that tells them when there isn't one.

CoCounsel is currently the most widely deployed legal AI in large firms. If you're at a BigLaw firm in the US, there's a decent chance your firm has an enterprise license and your colleagues are using it right now.

Harvey

Harvey is the name that comes up most often in conversations about where legal AI is heading. Built on top of the most capable foundation models and fine-tuned specifically on legal data, Harvey isn't trying to be a better Westlaw. It's trying to be something closer to a junior lawyer who never sleeps.

Firms like Allen & Overy, PwC Legal, and several major US litigation shops were early Harvey adopters. What they found: Harvey is genuinely useful for first-draft legal memos, translating complex agreements into plain summaries for clients, and synthesizing research across multiple documents. It's not a citation-checking tool — it's a drafting and synthesis tool, and that distinction matters when you're deciding where to trust it.

The honest knock on Harvey: it's not cheap, it requires enterprise deployment, and smaller firms often can't justify the cost relative to what CoCounsel or even a well-prompted Claude offers. But for high-volume practices with the budget, it's hard to argue against.

Contract Work: Drafting, Review, and Negotiation

After legal research, contract work is where AI has found the most traction in law. Contracts are structured, repetitive documents with predictable patterns — exactly the kind of thing language models are good at. And the volume of contracts that flow through any mid-size deal or corporate legal team creates constant pressure to move faster without missing things.

Spellbook

Spellbook lives inside Microsoft Word and acts as a real-time drafting assistant for contracts. You're working in your document, you highlight a clause, and Spellbook offers suggestions, flags issues, or drafts alternative language. It also has a "playbook" feature where you upload your firm's preferred positions on negotiation points, and Spellbook uses those as a baseline when reviewing third-party paper.

The Word integration is what makes it stick. Lawyers who have used it describe it as the difference between having an AI tool and actually using one. There's no context switch, no copying and pasting between windows. The AI is just there, inside the document they're already working in.

Good fit for: transactional lawyers who handle a high volume of commercial agreements, NDAs, and vendor contracts. Less useful for bespoke M&A work where every deal is different enough that templates become a liability.

Ironclad

Ironclad is a contract lifecycle management platform that has layered AI on top of an already solid workflow foundation. The AI features focus on contract review — flagging non-standard terms, surfacing risk, and identifying deviations from your organization's standard positions.

It's used more heavily on the corporate legal department side than at law firms. A general counsel's office handling hundreds of vendor agreements a year benefits enormously from AI that can pre-screen contracts and escalate only the genuinely unusual provisions. It doesn't replace lawyer review, but it radically changes what gets escalated and what gets approved as routine.

Kira Systems (now part of Litera)

Kira built its reputation in due diligence. When you're reviewing a data room full of contracts as part of an acquisition, Kira extracts clauses, categorizes provisions, and surfaces defined terms across hundreds of documents — the kind of work that used to take teams of junior associates weeks. It's now part of Litera's broader document management ecosystem, and firms that do significant M&A work treat it as standard infrastructure.

Litigation Support and E-Discovery

This is where the document volumes get truly large, and where AI has been doing useful work longer than most people realize. E-discovery platforms have used machine learning for predictive coding — teaching the system which documents are relevant — for over a decade. What's changed is the quality of the underlying models and the scope of what they can do.

Relativity

Relativity is the dominant platform in e-discovery, and its AI features — particularly around document review and privilege logging — have matured significantly. The core value is the same as it's always been: help humans review millions of documents for relevance and responsiveness without reading every one. The AI layer identifies likely-relevant documents, surfaces clusters of similar content, and flags potential privilege issues for human review.

What's changed in recent years: the conversational search layer. You can now describe in plain language what you're looking for — "emails discussing the board's awareness of the regulatory issue before the 2022 announcement" — and Relativity surfaces candidates rather than requiring Boolean query construction. For litigation teams under time pressure, that's a meaningful improvement.

EvenUp

EvenUp is a specialized tool for personal injury and mass tort practices that generates demand letters and case summaries from medical records and case documents. It's not a general-purpose legal AI — it's deeply tailored to a specific workflow in a specific practice area, which is exactly why plaintiff-side PI firms have adopted it at scale.

A demand letter that would have taken a paralegal a full day to compile from scattered medical records takes EvenUp about twenty minutes. At the volumes most PI firms operate at, that time saving is transformative. It's a good example of a vertical legal AI that wins precisely because it's narrow.

The Wildcard: Lawyers Using Claude and ChatGPT Directly

Here's what none of the official vendor surveys fully capture: a significant number of lawyers are using general-purpose LLMs — Claude, ChatGPT, sometimes Gemini — for legal work, often on their own devices, outside of any firm-approved tool.

They're using them for things like: drafting client communication in plain English from a dense legal analysis they've already done. Summarizing a 200-page expert report the night before a deposition. Translating a foreign-language contract clause for context before sending it to a certified translator. Generating a first outline for a brief they'll then research and write properly.

These use cases are real, they're widespread, and they're not going away. The smarter law firms have stopped pretending they can prevent it and started building policies that acknowledge the reality — acceptable use guidelines, confidentiality reminders, review requirements for anything AI-drafted that goes to a client or court.

The key constraint: don't put confidential client information into a commercial LLM without understanding what happens to that data. Most of the major providers offer enterprise tiers with data isolation. Most lawyers using free-tier ChatGPT on their personal laptops are not thinking about this. Firms that care about this — and every BigLaw firm should — provide approved, enterprise-contracted access so lawyers have a safe outlet for legitimate use.

The Issue Nobody Likes Talking About: Hallucinations in Legal Contexts

AI hallucinations are annoying in most contexts. In legal contexts, they can be career-ending.

The high-profile cases of lawyers submitting AI-generated briefs containing fabricated case citations have created a healthy professional paranoia that is, frankly, appropriate. The lesson is not "don't use AI." The lesson is "never submit anything AI-generated without checking every single citation against the primary source." Every single one. No exceptions.

The tools built specifically for legal — Lexis+ AI, CoCounsel, Harvey — have invested heavily in grounding their outputs in verified sources and flagging uncertainty. That's a real advantage over general-purpose LLMs. But no tool eliminates the need for human verification in anything that goes to a client, a court, or any document with legal consequences.

Professionals who have built this habit report that the verification step takes less time than people fear — often five minutes for a short memo, twenty for a longer brief. Given that the AI saved three hours of drafting, the math still works. It just requires not skipping the check.
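Part of that verification step can be mechanized. Below is a minimal Python sketch of the idea: pull every US-style reporter citation out of a draft into a checklist, so none get skipped during the manual check against primary sources. The regex and the `citation_checklist` helper are my own illustrative assumptions here, not a feature of any tool named in this piece, and the pattern covers only a few common reporter formats.

```python
import re

# Illustrative pattern for common US reporter citations, e.g.
# "576 U.S. 644", "123 F.3d 456", "410 F. Supp. 2d 100".
# It is a sketch, not an exhaustive Bluebook parser.
CITATION_RE = re.compile(
    r"\b\d{1,4}\s+"                                              # volume
    r"(?:U\.S\.|S\.\s?Ct\.|F\.(?:2d|3d|4th)?|F\.\s?Supp\.(?:\s?[23]d)?)"
    r"\s+\d{1,4}\b"                                              # first page
)

def citation_checklist(draft: str) -> list[str]:
    """Return unique citation strings in order of first appearance."""
    seen: list[str] = []
    for match in CITATION_RE.finditer(draft):
        cite = " ".join(match.group().split())  # normalize whitespace
        if cite not in seen:
            seen.append(cite)
    return seen

draft = (
    "Plaintiff relies on Obergefell v. Hodges, 576 U.S. 644 (2015), "
    "and the reasoning in 123 F.3d 456."
)
print(citation_checklist(draft))  # → ['576 U.S. 644', '123 F.3d 456']
```

Note what this does and doesn't do: it finds candidate strings so nothing gets overlooked, but it verifies nothing. Reading each cited case in the primary source is still the lawyer's job.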

What the Adoption Data Actually Shows

Multiple surveys of legal professionals in 2025 and early 2026 have produced a consistent picture:

  • Legal research is the highest-adoption use case, with a majority of lawyers at large firms using AI-assisted research tools regularly.
  • Contract review follows closely, particularly at corporate legal departments dealing with high-volume commercial work.
  • Brief drafting and motion practice see significant use of AI for first drafts, with human revision as the standard expectation.
  • Client communication — translating legal analysis into plain language — is a rapidly growing use case that most surveys undercount because it often happens through unofficial channels.
  • Court filings and opinions submitted to judges remain the area of highest caution, with most firms requiring explicit sign-off from a partner that AI-assisted content has been fully reviewed.

What the data also shows: adoption is much higher among associates than partners, and much higher at larger firms than smaller ones. But the gap is closing. Solo practitioners and small firms are finding that general-purpose AI is accessible enough to stand in for purpose-built legal tools whose cost they can't justify, even if the purpose-built tools are more trustworthy for high-stakes work.

How to Actually Choose a Tool

The honest answer is that the right choice depends almost entirely on practice area and volume, not on feature comparisons in vendor demos. A framework that works:

  • High-volume research practice (BigLaw, litigation boutique): Westlaw CoCounsel or Lexis+ AI. Pick the one your firm already has a database relationship with. The database quality matters more than the AI interface.
  • High-volume contract work (corporate, transactional, in-house): Spellbook for drafting in Word, Ironclad if you need full lifecycle management, Kira/Litera if M&A due diligence is a regular workflow.
  • Litigation support and e-discovery: Relativity if you're handling large document volumes. EvenUp if you're specifically a PI or mass tort firm.
  • Smaller firm or solo practice: Claude or ChatGPT with an enterprise subscription and a clear internal policy about what goes in and what needs verification before it goes out. The purpose-built tools may not be cost-justified at your volume.
  • Anyone exploring Harvey: worth a pilot if you have the budget and do enough work that the drafting assistance genuinely moves the needle. It's not a tool to acquire out of curiosity.

The Bigger Picture

The conversation in legal circles has moved past "will AI replace lawyers?" to something more nuanced and more interesting: what does legal work look like when the routine parts are fast?

The answer seems to be that the premium moves upstream. The time a senior lawyer saves on research and drafting doesn't disappear — it goes into the work that AI genuinely can't do. Reading a room in a deposition. Knowing which argument will resonate with this particular judge. Understanding what a client actually needs versus what they're asking for. Building the trust that comes from being the person who was right when it mattered.

The lawyers who are thriving with AI in 2026 are not the ones who found a tool that does their job. They're the ones who found tools that eliminate the parts of their job that were always just overhead — and freed themselves up to do the parts that only they can do.

That, it turns out, is a very good use of artificial intelligence.

Jaime Delgado

Product Analyst & AI early adopter

Jaime has been tracking the AI landscape since the GPT-3 era. He writes about AI capabilities, model comparisons, and practical applications for builders and founders. His daily driver is Claude inside Visual Studio Code — though he also reaches for Grok, Gemini, and ChatGPT when the question is quick and the context is light. He stays genuinely open to every AI that comes along: the landscape moves fast, and so does he. Based in Spain.
