Mississippi Federal Judge Admits AI Was Used to Write Inaccurate Court Order

Written by Merri

October 24, 2025

A federal judge in Mississippi, Henry T. Wingate, acknowledged that a member of his staff used artificial intelligence to write a mistaken court order — confirming months of speculation. The admission came after U.S. Senator Chuck Grassley launched an inquiry into the matter.

In a letter addressed to the Administrative Office of the Courts, Judge Wingate revealed that one of his law clerks relied on an AI tool called Perplexity to help prepare a court order related to a law banning diversity, equity, and inclusion (DEI) programs in Mississippi’s public schools.

The Error That Sparked Concern

The original order, dated July 20, contained serious factual inaccuracies. It mentioned parties who weren’t involved, misquoted state laws, and even cited a non-existent case. The Mississippi Attorney General’s Office quickly raised alarms about these mistakes.

Judge Wingate later issued a corrected order, deleted the flawed version from the court docket, and denied requests to restore it publicly. He explained that the mistakes were clerical, though he initially refused to elaborate further.

Senator Grassley’s Inquiry and the Judge’s Response

After months of silence, Senator Grassley, Chair of the Senate Judiciary Committee, sent a letter requesting answers about the origin of the inaccuracies and the possible use of AI.

Judge Wingate confirmed that his law clerk used Perplexity AI only to gather public information — no confidential or sealed data was accessed. However, the AI-generated draft was filed without the necessary judicial review, leading to the errors.

Wingate emphasized that he would not restore the original erroneous document to the public record to avoid confusion.

Steps Taken to Prevent Future Errors

In his response, Judge Wingate described the flawed order as a draft opinion that was mistakenly filed before review. To prevent a recurrence, he announced new corrective measures, including:

New Policy | Purpose
Independent review of all draft orders and memos | To ensure factual and legal accuracy
Printing and attaching all cited cases | To verify legal references
Regular supervision of AI-assisted work | To maintain ethical standards

He stressed that he manages a busy docket and remains committed to fairness and transparency.

“Given that I hold myself and my staff to the highest standards of conduct, I do not expect that a mistake like this one will occur in the future,” Judge Wingate wrote.

Broader Concerns About AI in the Judiciary

Senator Grassley, a Republican from Iowa, praised Wingate’s honesty but also highlighted the broader implications. He said every federal judge must ensure that AI use does not compromise fairness or litigants’ rights. Grassley urged the judicial branch to adopt strong, permanent AI policies.

Across the legal profession, AI use is rapidly expanding — from legal research to drafting motions. However, experts warn these systems can “hallucinate,” producing false information that looks real.

While some attorneys have been penalized for submitting AI-generated legal filings without verification, similar accountability for judges remains a gray area.

The Federal AI Task Force

To address these concerns, Robert Conrad Jr., director of the Administrative Office of the Courts, announced the creation of a judicial AI task force. This group — made up of judges and tech experts — will recommend new policies for the responsible use of AI in federal courts.

The task force has already released interim guidelines, encouraging lawyers to independently verify AI-generated content and to disclose AI use when drafting court documents.
