Microsoft Word is getting an AI legal agent, which sounds helpful until you remember how badly this has gone before. The new Legal Agent can review contracts, suggest edits, compare versions, and flag risky clauses inside Word. On paper, these features sound useful and convenient. However, generative AI tools have hallucinated entire cases, citations, and quotes out of thin air before, landing real people in real court trouble.
What can Microsoft’s Legal Agent do?
Microsoft says Legal Agent is available through Copilot in Word for users in its Frontier program in the U.S. It currently works on Word for Windows desktop. There is no separate app or installation required, though some users may need to restart Word before the agent appears.
Legal Agent is meant for contract and document review. Microsoft says it can check a contract clause by clause against a legal playbook, review a full agreement, compare different versions, flag risks and obligations, and suggest edits with tracked changes. It also keeps the original formatting, tables, lists, and negotiation history intact.
The company is also trying to avoid the obvious nightmare scenario for its users and itself. The feature has built-in safeguards, such as citations linked to the source language so reviewers can check suggestions before using them, along with clear disclaimers that it does not provide legal advice, may produce inaccurate content, and still requires review by a qualified legal professional before anything is relied on.
Why should lawyers still be nervous?
There is already precedent for AI going rogue in legal settings. Two New York lawyers were sanctioned in 2023 and ordered to pay a $5,000 fine after submitting a court filing that included fake cases generated by ChatGPT. Michael Cohen, Donald Trump’s former lawyer, also admitted that he unknowingly gave his attorney fake case citations generated by Google Bard. While Cohen was not sanctioned, the judge still called the episode embarrassing and stressed the need for skepticism when using AI in legal work.
These are not isolated cases. Judges have questioned or disciplined attorneys in multiple instances involving AI-assisted filings, and one French data scientist and lawyer identified hundreds of court documents containing fake citations and nonexistent references over the past year.

The bigger problem is that hallucinations remain unresolved. AI chatbots can still produce answers that sound confident while being partly or completely wrong. In legal work, that is especially dangerous, because a made-up citation or invented case can end up in a filing and create serious consequences.
Microsoft has built safeguards into Legal Agent to prevent these issues, but the lesson is already written in court records: AI can speed up legal work, yet the responsibility for fact-checking still falls on the lawyer.