
The Risk of Using AI-Generated Work and the Potential Liabilities of Lawyers 

July 31, 2023 By Ward Law, LLC

By: Ross G. Currie, Esq.

In the past few months, the use of AI has rapidly spread from education to the legal profession. From a user perspective, the economic appeal of AI-generated works is undeniable. Lawyers and firms that effectively leverage emerging AI technologies will be able to provide their services at a reduced cost, with greater efficiency, and with higher odds of favorable outcomes in litigation.  

Consider one of the most time-consuming tasks in litigation: distilling the most important, relevant information from a vast collection of documents produced during discovery. AI can significantly accelerate this process, doing in seconds work that might take even the most productive lawyers days or weeks to complete. There are, however, limits to what lawyers can—or should—rely on AI technology for.

In a recent headline-making case, a New York lawyer used ChatGPT for assistance with writing an affirmation in opposition to a motion to dismiss, to his and a colleague’s peril. The firm of attorneys Peter LoDuca and Steven Schwartz had been suing the Colombian airline Avianca on behalf of Roberto Mata, who claimed he was injured on a flight to John F. Kennedy International Airport in New York City. In an affirmation responding to Avianca’s motion to dismiss, plaintiff’s counsel cited more than half a dozen non-existent cases, including “Varghese v. China Southern Airlines,” “Martinez v. Delta Airlines,” and “Miller v. United Airlines.”

When neither counsel for Avianca nor the court could locate the cited cases, the court ordered LoDuca—who was counsel of record and signed the offending filing—to show cause why he should not be sanctioned. LoDuca submitted an affidavit stating that he personally had not performed any of the research or written the affirmation. Instead, Schwartz had done the research and writing, and LoDuca signed and filed the affirmation because Schwartz was not admitted to practice in the United States District Court for the Southern District of New York. Schwartz, in turn, filed an affidavit stating that he had “consulted the artificial intelligence website Chat GPT in order to supplement the legal research” for the filing. Schwartz further attested that he relied on ChatGPT and was unaware that its contents could be false.

The judge in the case was not moved and sanctioned the attorneys and their firm for submitting false filings to the court. Although the judge “only” imposed $5,000.00 in monetary sanctions, the publicity of the case has perhaps irreparably damaged these attorneys’ reputations with the bench and bar. 

This cautionary tale is a reminder that AI tools, while useful, are just that—tools. AI is not a substitute for thoughtful—and reliable—writing and advocacy. AI generators are typically trained by analyzing vast databases and synthesizing information—information which may or may not be accurate or consistent. Ordinary users of an AI generator will likely have little to no idea what algorithms or source databases were used to train the system. This case is also a reminder that lawyers should always verify information and citations before submitting them to a court, as an adversary and the judge will almost certainly check the sources. Lawyers’ ethical obligations—to clients, adversaries, and courts—still require that lawyers, not AI, take responsibility for maintaining the integrity of the judicial system. Failing to heed these lessons places lawyers at risk of severe penalties, such as liability for legal malpractice, suspension, or even disbarment.

StraightforWARD Legal Advice:
Legal professionals with questions about AI and professional liabilities should contact Ross G. Currie at (215) 647-6604 or rcurrie@thewardlaw.com.  

Filed Under: Blog, Professional Liability Tagged With: Lawyers Professional Liability, Legal Malpractice, National, Professional Liability
