AI missteps in court filings prompt Florida Bar rules review

Chief Judge Mark W. Klingensmith, chair of the Florida Conference of DCA Judges, is concerned existing rules don’t go far enough to address generative AI’s inherent risks.

Asked to consider AI-generated court filings marred by “hallucinations,” Florida Bar rules committees are weighing a state version of Federal Rule of Civil Procedure 11, which authorizes tough sanctions against lawyers who are not truthful to a court.

Fourth DCA Chief Judge Mark Klingensmith, who issued a referral letter to the committees as chair of the Florida Conference of DCA Judges, is concerned existing rules don’t go far enough to address generative AI’s inherent risks.

“So, to speak with one voice through the court system, we thought perhaps the rules committees should take a look at it and see if they felt that it was something that needed to be addressed,” Klingensmith said.

The February 20 referral letter is addressed to the Rules of General Practice and Judicial Administration Committee, and the Appellate Court Rules Committee.

“We didn’t take a position on which track to take, what language to use, we just wanted to tee up the discussion and let them know this is something we’re keeping an eye on,” Klingensmith said.

Generative AI, which allows a lawyer to input the facts of a case and receive an on-point brief in seconds, relies on machine-learning models trained on vast amounts of online data. Even its creators cannot fully explain how the technology arrives at any given answer, and they warn that it can “hallucinate,” delivering fictional results that sound convincing.

Lawyers have already been disciplined for misusing it.

The first red flag appeared in June 2023 with Mata v. Avianca, Inc., a case in the Southern District of New York in which the plaintiff’s attorneys faced Rule 11 sanctions for filing a ChatGPT-generated brief that cited nonexistent judicial opinions.

In October 2023, incidents cropped up in federal courts in Georgia and New Mexico. By December 2023, AI-generated hallucinations turned up in a second Southern District of New York case, U.S. v. Cohen, according to a report by the New York Federal Court Bar Association.

“These are not pro se litigants doing research at home,” Klingensmith said. “Some very good lawyers, and some very well-known law firms, have had this happen to them.”

In March of 2024, the U.S. District Court for the Middle District of Florida suspended attorney Thomas Grant Neusom for a year.

A grievance committee report noted that Neusom filed pleadings that “contain inaccurate authorities to support what appear to be mostly frivolous legal arguments, in violation of Rule 4-1.3.”

According to the report, Neusom told the committee that he used Westlaw and Fastcase, and “may have used artificial intelligence to draft the filing(s), but that he was not able to check the excerpts and citations.”

Klingensmith said that before the referral was issued, the conference was unaware of any incidents in Florida. He has yet to see any in the Fourth DCA.

“So, the executive committee, in consultation with our colleagues, on my court and the other chief judges on theirs, decided to take a wait-and-see attitude, to see if the Bar would do anything to address the issue,” he said.

At that point, a Bar panel had been tackling the challenge for months. Immediate past President Scott Westheimer announced the Special Committee on AI Tools & Resources at his swearing-in ceremony in June 2023.

The panel filed a formal request for an ethics opinion addressing generative AI and its ethical implications for everything from a lawyer’s duty of competence to client confidentiality, lawyer advertising, and fees.

Earlier, the panel proposed amendments to the comments to various Bar rules; the Board of Governors approved them, and they remain pending before the Supreme Court. The AI committee has also asked Bar rules committees to address generative AI. Unlike rules, comments to rules are aspirational, and ethics opinions are non-binding.

Klingensmith said the DCA judges were encouraged when the Board of Governors voted January 19 to approve Ethics Opinion 24-1. But the judges also noted that after the vote, Westheimer remarked that some DCA judges told him they were seeing AI-generated briefs that contained errors.

Without suggesting specific language, the DCA judges’ referral letter asked the committees to consider amendments to Rule of General Practice and Judicial Administration 2.515 (Signature and Certificates of Attorneys and Parties), and Rule of Appellate Procedure 9.210 (Briefs), “or others to appropriately address lawyers’ use of AI in court filings.”

The judges also asked the committees to “consider the feasibility of enacting certification language in a standard format for filings in all courts of the state, or by incorporating a standard as in Rule 2.515 whenever a signature is affixed to any pleading motion or brief filed with a court, that the submission has been prepared consistent with the ethical obligations referenced in Ethics Advisory Opinion 24-1, including, but not limited to, the lawyer’s duties of competence (Rule 4-1.1), and candor to the tribunal (Rule 4-3.3).”

The letter noted that “several courts, particularly in the federal system, are considering or already require the filing of a certificate attesting either 1) that no portion of any filing has been drafted by generative AI (such as Chat GPT, Harvey AI, or Google Bard); or 2) that any language or content drafted with the assistance of a generative AI platform on the basis of natural language prompts has been checked for accuracy by a human being using print reporters or traditional legal databases.”

The letter pointed to a rule then under consideration by the U.S. Court of Appeals for the Fifth Circuit; courts in Hawaii, New Jersey, Oklahoma, and Pennsylvania are weighing similar proposals.

In January, the U.S. Court of Appeals for the Second Circuit issued a decision in Park v. Kim, No. 22-2057, 2024 WL 332478, at *4 (2d Cir. Jan. 30, 2024), in which the court determined that a local rule is not necessary to impose sanctions for AI-related hallucinations because Federal Rule 11 already covers them, Klingensmith noted.

In the case, an attorney who had missed two deadlines filed a brief citing Matter of Bourguignon v. Coordinated Behavioral Health Servs., Inc., 114 A.D.3d 947 (3d Dep’t 2014), a citation later determined to be an AI fabrication.

In an AI seminar at the Annual Florida Bar Convention in June, Belmont University Law Professor Tim Chinaris said the Fifth Circuit rejected the rule proposal after opponents argued it was unnecessary.

“I think a lot of federal judges were promulgating their own rules because they were unsure about that,” Klingensmith said. “But we now have opinions out of the Fifth Circuit and the Second Circuit, which say, ‘yes, Rule 11 does apply to this situation,’ and they can take the appropriate action when necessary.”

One reason the DCA judges referred the matter to Bar rules committees was to avoid a piecemeal approach, Klingensmith said. He has been attending meetings of a rules subcommittee created to respond to the referral.

The subcommittee is focusing on Federal Rule 11, which requires, in relevant part, that attorneys or parties certify “to the best of the person’s knowledge, information and belief, formed after an inquiry reasonable under the circumstances…the claims, defenses, and other legal contentions are warranted by existing law or by a nonfrivolous argument for extending, modifying, or reversing existing law…the factual contentions have evidentiary support,” or are likely to have evidentiary support.

Klingensmith said the subcommittee is considering two competing proposals.

“So, I’ll call ‘Proposal A’ the addition of Rule 11 language [to] Rule 2.515.”

Proposal B, Klingensmith said, involves adding language that goes beyond Rule 11.

“Some members of the subcommittee believe that Rule 11 is good, but it doesn’t go far enough to address some of the other issues,” he said.

Special Committee on AI Tools & Resources Co-Chair Duffy Myrtetus, a veteran member of the Board of Governors, proposed using language from a provision of the Virginia code that is similar to Rule 11.

Twelfth Judicial Circuit Judge Matt Whyte, in an email to the subcommittee, argued for adopting Federal Rule 11 wholesale.

“I am a big believer in taking something that works in another area/field and shamelessly adapting it to our needs,” he said. “Rule 11 has been around for a long time, is (or should be) familiar to every lawyer (assuming Fed Civ Pro is still required at every law school), and has a substantial body of caselaw supporting, defining, and explaining it.”

The full rules committees will have to weigh whatever product the subcommittee submits. The Board of Governors eventually will be called upon to make a recommendation. A final draft will be posted for public comment.

The Florida Supreme Court will have the final say.
