14 May 2026

Improper validation leads to FDA’s first AI warning letter 

Author: Tristan Worden

Reviewed by: Art Gehring

Last updated: May 14, 2026

On April 2, 2026, FDA issued a warning letter to Purolea Cosmetics Lab that marks a regulatory first. Among numerous Current Good Manufacturing Practice (CGMP) violations, FDA cited the inappropriate use of artificial intelligence (AI) in pharmaceutical manufacturing — the first time this has appeared as a named violation in an FDA warning letter. 

The details are straightforward and worth reading carefully.

During an inspection of Purolea’s facility in Livonia, Michigan, FDA investigators found that the company used AI agents to generate drug product specifications, procedures, and master production and control records.  

FDA’s position is unambiguous: if you use AI as an aid in document creation, you must review the AI-generated documents to ensure they are accurate and actually compliant with CGMP. Failure to do so is a violation of 21 CFR 211.22(c). 

The more striking finding was this: when FDA investigators informed Purolea that process validation had not been completed prior to distribution — a clear CGMP requirement — the company responded that it was unaware of the requirement. Its AI agent, staff said, had never told them it was required.

That explanation captures the core problem. AI can accelerate document creation, synthesize regulatory language, and support a validation program. It cannot substitute for the human oversight, subject matter expertise, and quality unit accountability that CGMP demands. The firm’s quality unit failed to review AI-generated outputs, failed to validate their accuracy, and — critically — failed to apply independent regulatory judgment. The AI did not cause those failures. Absent governance did. 

Kneat’s experts have put together a guide to protecting compliance while using AI in your manufacturing. Download The Five Pillars of AI Governance and avoid a warning letter of your own.

This is not an argument against AI in manufacturing. FDA itself acknowledged that firms may use AI to support CGMP activities, including developing procedures and specifications. But any output or recommendation from an AI agent must be reviewed and cleared by an authorized human representative of the firm’s quality unit. That is not a burden unique to AI — it is the same standard applied to any document, tool, or process in a regulated environment.

The lesson is not to avoid AI. It is to govern it. Validated workflows, qualified personnel, and documented oversight are what separate AI as a compliance asset from AI as a compliance liability. For manufacturers already operating under GAMP 5, 21 CFR Part 11, and current FDA Computer Software Assurance (CSA) guidance, this warning letter reinforces a principle that should already be in place: no automated output enters a GxP process without human review and accountability. 
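In software terms, that principle is a hard gate, not a suggestion: an AI-generated document simply cannot enter a GxP process until an authorized reviewer has signed off. As a loose, purely illustrative sketch (the names and structure here are hypothetical, not drawn from Kneat, FDA, or any specific system), such a gate might look like:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlledDocument:
    """A controlled record, e.g. a specification or master batch record."""
    doc_id: str
    content: str
    ai_generated: bool = False
    approved_by: Optional[str] = None  # authorized quality-unit reviewer

def release(doc: ControlledDocument) -> ControlledDocument:
    """Refuse to release any document lacking a recorded human approval.

    The rule applies whether or not the document was AI-generated:
    no approval on record, no release into the GxP process.
    """
    if doc.approved_by is None:
        raise PermissionError(
            f"{doc.doc_id}: cannot release without quality-unit approval"
        )
    return doc
```

The point of the sketch is that review is enforced structurally — there is no code path that moves an unapproved document forward — rather than relying on procedure alone.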

The Purolea letter will be cited for years. Let it be cited as the case that clarified the boundary — not as a cautionary tale that discouraged innovation, but as the moment the industry defined what responsible AI use in manufacturing actually looks like. 

Kneat is the industry’s most trusted digital validation platform, now with AI. As the trusted system of record with built-in human-in-the-loop oversight, Kneat AI empowers compliant validation through the entire manufacturing lifecycle. Learn more here.

Written By

Tristan Worden

Senior Content Marketing Strategist, Kneat

Tristan has over a decade of experience communicating complex ideas to both general and specialist audiences, including almost five years in the compliance and validation sector. He is passionate about ensuring validation professionals have the information they need to do their jobs effectively.

Revolutionize your validation

Digitalize validation your way, with the validation platform trusted by the world’s leading life sciences companies.

Book a demo