Womenz Mag

Judge Stranch Warns “ChatGPT Does Not Verify Accuracy” After Colleague Cites AI in Racial Case

Chad Readler (Photo by Tom Williams via Getty Images)

A federal appeals judge turned to an AI chatbot and Urban Dictionary to help figure out whether the phrase “monkey a–” carried a racial slur when used against Black employees.

The unusual move came during a discrimination case brought by Thomas Michael Smith and Monaleto Sneed, who said their supervisors at PAM Transport Inc. targeted them with terms like “monkey” and “monkey a–.” They argued the insults were racially charged and part of broader discrimination at the trucking company.

Last year, District Judge Eli Richardson in Tennessee sided with PAM Transport, tossing out the claims. But on Thursday, a three-judge panel at the Sixth Circuit Court of Appeals reversed that ruling, breathing new life into the lawsuit, the Daily Beast reported.

Donald Trump, who nominated Readler to the appeals court in 2018 (Photo by Leon Neal/Getty Images)

Judge Chad Readler, a Trump appointee, wrote a concurring opinion that stood out for its methods. He admitted the Tennessee court “had difficult issues to address in the delicate setting of race discrimination.” One question, he said, was how to assess intent and context when “the individual who purportedly engaged in race discrimination is a member of the plaintiff’s race.”

That’s when Readler turned to unconventional sources. “Does the term ‘monkey a–,’ a phrase understandably not included in traditional dictionaries, have the same racial connotation as the term ‘monkey’?” he wrote. To help answer, he cited definitions from Urban Dictionary and even ChatGPT.


When Readler asked the chatbot what the phrase meant, it responded, “Racial? Not inherently — but can be, depending on how and to whom it’s said.”

While Readler included the AI analysis in his concurrence, the panel’s main opinion, written by Judge Jane Stranch, an Obama appointee, made the call more directly. Stranch wrote that “there is no meaningful difference between the terms ‘monkey’ and ‘monkey a–’ when used by a supervisor against an African American employee, as alleged here.”

Stranch also noted in a footnote that Readler’s reliance on ChatGPT was unusual. She explained that while ChatGPT can consolidate massive amounts of text, “it does not independently verify the accuracy of any material or its unknown sources.”

The Sixth Circuit’s ruling clears the way for Smith and Sneed to continue their case, a major reversal from the Tennessee district court’s earlier dismissal. It also highlights how questions of language and race can land in unexpected legal territory — and how even judges are now testing tools like AI chatbots when conventional references fall short.

Readler, who was nominated to the appeals court by Donald Trump in 2018 and confirmed the following year, has not commented publicly on why he included ChatGPT’s explanation in his reasoning.

The case now heads back to court, but the fact that AI made its way into a federal opinion has already drawn attention. Whether or not it becomes more common, this ruling is a reminder of just how unsettled and sensitive the legal system’s handling of language and racial discrimination can be.
