Don’t Be a Chatbot Lawyer. By Chinua Asuzu

The phrase “chatbot lawyer” is a neologism coined by Eugene Volokh to denote “a lawyer who relies on a chatbot such as ChatGPT to generate briefs and other legal documents.” See Bryan A. Garner, “Will ‘chatbot lawyer’ make it into Black’s Law Dictionary?”, ABA Journal, 1 June 2023, abajournal.com.

Large language models like ChatGPT have revolutionized legal research and writing. With a few carefully framed prompts, you can generate draft articles, blogs, briefs, contracts, and pleadings. But if you fail to cross-check, drastically edit, and adapt AI-generated drafts, or if you rely on AI to carry out your legal analysis, you’re a chatbot lawyer.

Every good lawyer knows that legal research and writing demand acumen, discernment, insight, judgment, and sagacity. None of these can be automated. Your AI aides don’t read cases; they predict text. They don’t weigh authority; they mimic patterns.

And they shouldn’t usurp your obligation to apply the law to specific facts with intellectual rigor. Besides, they’re in the habit of hallucinating: generating false, fabricated, misleading, and nonexistent cases, data, maxims, principles, rules, and statutes.

Treat your AI aide (ChatGPT, etc.) as a research assistant, but remember to cross-check and confirm whatever it generates (cases, rules, statutes, etc.), and do so in the original sources.

Never use your AI aide to bypass the hard work of reading authorities, identifying legal issues, or crafting persuasive arguments. Read the cases yourself. Trace the reasoning yourself. Test the logic yourself. Make the judgment yourself.

The practice of law is the disciplined application of the human mind to legal problems. A chatbot can imitate the form of legal writing, but not its substance.

Receive deliverance from automation bias: the blind reliance on artificial intelligence.
