Every entry below is from the public record: a sanctioned attorney, a disqualified counsel, a published warning. The professional rules have not changed: a lawyer is responsible for every citation in a filing they sign, regardless of who or what produced it.
S.D.N.Y. · 2023
$5,000 sanction
Mata v. Avianca, Inc. Two attorneys submitted a brief citing six fabricated cases generated by ChatGPT. The judge wrote that he might not have imposed sanctions had the attorneys promptly come clean; they did not.
Mata v. Avianca, 678 F. Supp. 3d 443 (S.D.N.Y. 2023)
M.D. Fla. · 2024
1-year federal suspension
A Florida attorney was suspended from practicing in the Middle District of Florida after filing pleadings citing cases that were, in the court's words, "completely fabricated."
In re Neusom · 1-year suspension, March 8, 2024
Cal. Ct. App. · 2025
$10,000 · 21 of 23 quotes fabricated
A California appellate court fined an attorney for filing an opening brief in which 21 of 23 quoted authorities were fabricated. The court published the opinion as a warning to the bar.
California 2nd District Court of Appeal, Sept. 2025
N.D. Ala. · 2025
Disqualified · referred to bar
Johnson v. Dunn. After a large firm filed hallucinated citations, the court disqualified the offending attorneys, published its opinion in the Federal Supplement, and directed the clerk to notify bar regulators in every state where they were licensed.
Johnson v. Dunn, No. 2:21-cv-1701 (N.D. Ala., July 23, 2025)
D. Or. · 2026
$109,700 in sanctions and costs
A federal court in Oregon ordered a lawyer to pay $109,700 in sanctions and costs over a filing containing AI-generated errors, reported as one of the largest such penalties to date.
Reported by NPR, April 2026
Worldwide · ongoing
1,200+ documented cases
Damien Charlotin, a researcher at HEC Paris, maintains a public database of court decisions involving AI hallucinations. By April 2026 it had logged more than 1,200 cases, roughly 800 from U.S. courts.
AI Hallucination Cases · damiencharlotin.com