Compliance Update

What Courts Are Saying About AI in Legal Filings

By Fractal Legal · March 2026

In June 2023, a New York attorney was sanctioned for submitting a brief filled with fabricated case citations generated by ChatGPT. The Mata v. Avianca case made national headlines and became a cautionary tale. But the legal profession’s response has moved far beyond cautionary tales. Courts across the country are now issuing specific rules about how AI can — and cannot — be used in legal filings.

For New York firms, understanding this rapidly evolving landscape is not optional. It is a compliance requirement.

The Wave of Standing Orders

Since Mata v. Avianca, dozens of federal and state courts have issued standing orders, local rules, or guidelines addressing AI use in legal practice. The pace accelerated through 2024 and into 2025, with new requirements appearing almost monthly.

The most common requirement: disclosure. Many courts now require attorneys to certify whether AI tools were used in preparing a filing. Some require disclosure of which tools were used and how. Others go further, requiring attorneys to affirm that all citations have been independently verified.

The specific requirements vary from court to court, and even from judge to judge, but the approaches fall into recognizable patterns.

The Three Categories of Court Response

Court approaches to AI in legal filings generally fall into three buckets:

1. Disclosure-Only

The lightest touch. Attorneys must disclose if AI was used but face no restrictions on use itself. The reasoning: existing ethical rules already govern attorney competence and candor. Disclosure simply adds transparency. This is the most common approach.

2. Disclosure Plus Verification

Beyond disclosure, attorneys must certify they have personally verified all AI-assisted work product — especially case citations, statutory references, and factual assertions. Several Southern District judges take this approach, reflecting the lesson of Mata.

3. Restrictive

A small number of courts have imposed outright bans or severe restrictions on AI-generated content in filings. These are increasingly rare, as the profession has come to recognize that the problem is not the tool itself but unsupervised use of it.

What the Bar Associations Are Saying

State and local bar associations have been issuing ethics opinions at an accelerating pace, and their guidance points in the same direction as the court rules.

The Common Thread: Supervision, Not Prohibition

Across courts and bar associations, the consensus is converging: AI tools are permissible, but attorneys remain fully responsible for every word filed with the court.

This means firms need three things:

  1. A clear AI usage policy that defines which tools are approved, what types of work they can assist with, and what verification steps are required before filing.
  2. Training so every attorney and paralegal understands how to use AI tools competently — including their limitations, hallucination risks, and confidentiality implications.
  3. Documentation of AI-assisted work so the firm can comply with disclosure requirements and demonstrate responsible use if questioned.

Practical Implications for New York Firms

If you practice in federal court: Check the standing orders of every judge before whom you appear. Requirements vary not just by district but by individual judge. A filing that complies in one courtroom may violate rules in the next.

If you practice in state court: Monitor developments from the Office of Court Administration and your local administrative judge. State-level rules are still evolving, but the trend toward disclosure is clear.

Regardless of where you practice: The ethical obligations are the same. If you use AI to draft a brief, you own every citation, every argument, every factual assertion. “The AI wrote it” is not a defense. It never will be.

What Firms Should Do Now

The firms that will navigate this well are the ones that treat AI governance the way they treat conflicts checks — as a non-negotiable part of professional practice, not an afterthought.

The courts are watching. The bar associations are watching. Your clients are watching. The firms that get ahead of this — with clear policies, proper training, and responsible use — will build trust and competitive advantage. The firms that don’t will learn the hard way.


Fractal Legal helps New York law firms implement AI training programs and usage policies that satisfy court disclosure requirements and bar association ethics rules. Schedule a free assessment to see where your firm stands.
