
A Major Law Firm Is Threatened With Sanctions for Citing "AI Hallucinated" Cases to the Court

The largest personal injury law firm in the United States, Morgan & Morgan, could be subject to sanctions for citing non-existent cases to a federal court in Wyoming. The non-existent cases were cited in a Motion in Limine filed on January 22, 2025, seeking to limit evidence introduced by a corporate defendant, Walmart, in a personal injury action. The filing cited a total of nine (9) cases, eight (8) of which did not exist and were nothing more than "AI hallucinations" created by the artificial intelligence ("AI") available through ChatGPT. Unable to locate the cases cited in the filing, the district court judge ordered the lawyers to provide certified copies of the cases, identify how the non-existent case cites were generated, and otherwise show why they should not be sanctioned. The attorneys who cited the "AI hallucinated" cases have since withdrawn the Motion.

The use of AI has become a heated topic among lawyers, law professors, and judges. While AI can be both useful and economical in conducting legal research, it can also generate non-existent cases, or "AI hallucinations," as occurred here. "AI hallucinations" and made-up facts and law can be difficult to detect absent active quality control by the humans using the AI. Why these "AI hallucinations" occur, when they may pop up, and the circumstances surrounding their occurrence remain largely unknown. This is especially so with respect to "AI hallucinations" based upon made-up facts and information, as opposed to those based upon bias, incorrect data, poor assumptions, or cyberattacks. It has become patently obvious, however, that when AI is used to support important decisions or representations, human verification of the conclusions to root out "AI hallucinations" is critical.

Reliance upon false, misleading, or made-up "AI hallucinations" is clearly detrimental to any business transaction, but especially so when used to support representations to a federal court. Rule 11 of the Federal Rules of Civil Procedure provides that attorneys and pro se parties certify and represent that, to the best of their knowledge, information, and belief, formed after an inquiry reasonable under the circumstances, the legal contentions are warranted by existing law or by a nonfrivolous argument for extending, modifying, or reversing existing law or for establishing new law. Representations based upon "AI hallucinations" made in a motion filed with the court clearly and unequivocally violate this standard, thereby subjecting the attorneys signing the motion to sanctions.

While AI can be very useful and economical for legal research, users must impose quality control "guardrails" to ensure its accuracy and prevent "AI hallucinations." One simple guardrail, sketched below, is to flag every citation in a draft that a human has not independently verified. Otherwise, users could find themselves in the same unfortunate situation as the attorneys who filed the Motion citing "AI hallucinated" cases to the federal court in Wyoming.
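As a purely illustrative sketch of such a guardrail (the citation regex, the helper names, and the "verified set" workflow below are simplifying assumptions for demonstration, not a production citation checker or anyone's actual tooling), a short Python script could extract candidate case citations from a draft filing and flag any that no human has yet confirmed against an official reporter:

```python
import re

# Matches simple reporter citations such as "534 F.3d 1290":
# volume, reporter abbreviation, first page. Real citation formats
# are far more varied; this pattern is deliberately simplified.
CITATION_PATTERN = re.compile(
    r"\b(\d{1,4})\s+([A-Z][A-Za-z0-9.]*(?:\s[A-Za-z0-9.]+)?)\s+(\d{1,5})\b"
)

def extract_citations(draft_text: str) -> list[str]:
    """Pull candidate case citations out of a draft filing."""
    return [" ".join(m.groups()) for m in CITATION_PATTERN.finditer(draft_text)]

def unverified_citations(draft_text: str, verified: set[str]) -> list[str]:
    """Return citations nobody has confirmed against an official reporter."""
    return [c for c in extract_citations(draft_text) if c not in verified]

if __name__ == "__main__":
    # "534 F.3d 1290" is the one real cite from the Wyoming order;
    # "Smith v. Acme Corp., 123 F.4th 456" is a made-up placeholder
    # standing in for a hallucinated case.
    draft = (
        "See United States v. Caraway, 534 F.3d 1290 (10th Cir. 2008); "
        "Smith v. Acme Corp., 123 F.4th 456 (10th Cir. 2024)."
    )
    verified = {"534 F.3d 1290"}  # only cases a person actually pulled and read
    for cite in unverified_citations(draft, verified):
        print(f"UNVERIFIED - do not file until confirmed: {cite}")
```

The point of the sketch is the workflow, not the regex: nothing enters the "verified" set until a human has located and read the cited opinion, so any citation an AI tool invented surfaces as a warning before the document is filed.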


The court's order states:

"This matter is before the Court on its own notice. On January 22, 2025, Plaintiffs filed their Motions in Limine. [ECF No. 141]. Therein, Plaintiffs cited nine total cases [case list omitted]. The problem with these cases is that none exist, except United States v. Caraway, 534 F.3d 1290 (10th Cir. 2008). The cases are not identifiable by their Westlaw cite, and the Court cannot locate the District of Wyoming cases by their case name in its local Electronic Court Filing System. Defendants aver through counsel that 'at least some of these mis-cited cases can be found on ChatGPT.' [ECF No. 150] (providing a picture of ChatGPT locating 'Meyer v. City of Cheyenne' through the fake Westlaw identifier)."
