The largest personal injury law firm in the United States, Morgan & Morgan, could be subject to sanctions for citing non-existent cases to a federal court in Wyoming. The non-existent cases were cited in a Motion in Limine filed on January 22, 2025, seeking to limit evidence introduced by a corporate defendant, Walmart, in a personal injury action. The filing cited a total of nine (9) cases, eight (8) of which did not exist and were nothing more than “AI hallucinations” created by the artificial intelligence (“AI”) tool ChatGPT. Unable to locate the cases cited in the filing, the district court judge ordered the lawyers to provide certified copies of the cases, identify how the non-existent case citations were generated, and otherwise show why they should not be sanctioned. The attorneys who cited the “AI hallucinated” cases have since withdrawn the Motion.
The use of AI has become a heated topic among lawyers, law professors, and judges. While AI can be both useful and economical in conducting legal research, it can also generate non-existent cases, or “AI hallucinations,” as occurred here. “AI hallucinations” and made-up facts and law can be difficult to detect absent active quality control by the humans using the AI. Why these “AI hallucinations” occur, when they may appear, and the circumstances surrounding their occurrence are currently unknown. This is especially so with respect to “AI hallucinations” based upon made-up facts and information, as opposed to those based upon bias, incorrect data, poor assumptions, or cyberattacks. It has become patently obvious, however, that when AI is used to support important decisions or representations, human verification of its conclusions to root out “AI hallucinations” is critical.
Reliance upon false, misleading, or made-up “AI hallucinations” is clearly detrimental to any business transaction, but especially so when used to support representations to a federal court. Rule 11 of the Federal Rules of Civil Procedure provides that attorneys and pro se parties certify and represent that, to the best of their knowledge, information, and belief, formed after an inquiry reasonable under the circumstances, the legal arguments are warranted by existing law or by a nonfrivolous argument for establishing new law. Representations based upon “AI hallucinations” made in a motion filed with the court clearly and unequivocally violate this standard, thereby subjecting the attorneys signing the motion to sanctions.
While AI can be very useful and economical for legal research, users must impose quality control “guardrails” to ensure its accuracy and catch “AI hallucinations.” Otherwise, users could find themselves in the same unfortunate situation as the attorneys who filed the Motion citing “AI hallucinations” in the federal court in Wyoming.
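For readers who want a concrete picture of what such a “guardrail” might look like, the sketch below illustrates one simple approach: no AI-drafted citation is treated as real until a human has confirmed it against a primary source and added it to a verified list. This is a minimal, hypothetical illustration only; the case names, the verified list, and the helper function are invented placeholders, not a real citator or court database, and any actual workflow would rely on a human researcher pulling each cited opinion before the document is signed and filed.

```python
# Hypothetical "guardrail" sketch: flag any citation in an AI-drafted filing
# that has not been independently verified by a human against a primary source.
# All case names below are invented for illustration.

AI_DRAFT_CITATIONS = [
    "Smith v. Acme Corp., 123 F.3d 456 (10th Cir. 1999)",       # hypothetical
    "Doe v. Retailer Inc., 987 F. Supp. 2d 65 (D. Wyo. 2013)",  # hypothetical
]

# Citations a human researcher has actually located and read; this set stands
# in for whatever authoritative lookup (citator, court database) a firm uses.
VERIFIED_CITATIONS = {
    "Smith v. Acme Corp., 123 F.3d 456 (10th Cir. 1999)",
}

def unverified(citations, verified):
    """Return the citations that no human has confirmed against a primary source."""
    return [c for c in citations if c not in verified]

if __name__ == "__main__":
    flagged = unverified(AI_DRAFT_CITATIONS, VERIFIED_CITATIONS)
    if flagged:
        print("Do not file. The following citations could not be verified:")
        for citation in flagged:
            print(" -", citation)
    else:
        print("All citations verified against an independent source.")
```

The design point is not the code itself but the rule it enforces: the AI’s output is treated as a draft to be checked, and anything that cannot be traced to a real, human-confirmed source never reaches the court.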