
Microsoft is building an AI agent for lawyers in Word. Let’s hope it doesn’t go badly.

Microsoft Word is getting an AI legal agent, which sounds useful until you remember how badly this has gone before. The new Legal Agent can review contracts, suggest edits, compare versions, and flag risky clauses directly within Word. On paper, these features sound genuinely helpful. But AI tools have already invented entire cases, quotes, and citations out of thin air, and dragged real people into real trouble in real courtrooms.

What can Microsoft’s Legal Agent do?

Microsoft says Legal Agent is available through Copilot in Word for users on its Frontier program in the US. It currently runs on Word for Windows desktop. No separate application or installation is required, although some users may need to restart Word before the agent appears.

Legal Agent is built for contract and document review. Microsoft says it can check a contract clause by clause against a firm’s playbook, review the full agreement, compare different versions, flag risks and obligations, and suggest edits as tracked changes. It also keeps the original formatting, tables, lists, and comment history intact.

The company is also trying to head off the obvious risks, both for its users and for itself. The feature ships with built-in safeguards: suggestions come with citations linked to the source language so reviewers can verify them before accepting, and disclaimers state that the agent does not provide legal advice, may generate inaccurate content, and that its output must be reviewed by a qualified legal professional before anything is trusted.

Why should lawyers still be nervous?

There is already precedent for AI going wrong in legal settings. Two New York attorneys were sanctioned in 2023 and ordered to pay a $5,000 fine after filing a brief citing fake cases generated by ChatGPT. Michael Cohen, Donald Trump’s former lawyer, also admitted that he unknowingly gave his attorney fabricated case citations produced by Google Bard. Although Cohen was not sanctioned, the judge still called the episode embarrassing and emphasized the need for skepticism when using AI in the legal profession.

These are not isolated incidents: judges have questioned or sanctioned lawyers in many cases involving AI-assisted filings, and a French data scientist and lawyer has catalogued hundreds of court documents containing fabricated quotes and nonexistent citations over the past year.

The biggest problem is that hallucinations remain unsolved. AI chatbots can still generate answers that sound confident while being partially or completely wrong. In the legal profession, that is especially dangerous, because a fabricated citation or an invented case can end up in a filing and carry serious consequences.

Microsoft has built plenty of safeguards into Legal Agent to prevent these problems, but the lesson is already written into the court record: AI can speed up legal work, yet the responsibility for fact-checking still rests with the lawyer.
