The moment a patient uploads their medical record to your AI health platform, HIPAA protection evaporates. The law covers data only while it sits with covered entities (healthcare providers, health plans, clearinghouses) and their business associates. Consumer AI tools don't qualify.

The 21st Century Cures Act built this pipeline deliberately. Patients now have the legal right to download and transfer their electronic health records, backed by standardized FHIR APIs that make the transfer technically trivial. As of February 2026, over 1,600 complaints have been filed with the Information Blocking Complaint Portal against providers trying to slow that transfer down. Enforcement is building.
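To see how low that technical bar sits, here is a minimal sketch of a patient-directed export against a certified FHIR R4 endpoint. The base URL, patient ID, and token are hypothetical placeholders; in practice the token comes out of a SMART on FHIR authorization flow.

```python
# Hypothetical endpoint and token, standing in for any Cures Act-certified EHR.
import requests

FHIR_BASE = "https://ehr.example.com/fhir/r4"   # placeholder base URL
TOKEN = "patient-authorized-access-token"       # obtained via SMART on FHIR OAuth

# Patient/$everything is the standard FHIR operation that returns the
# patient's full record (conditions, meds, notes, dependents) as one Bundle.
resp = requests.get(
    f"{FHIR_BASE}/Patient/123/$everything",
    headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()
bundle = resp.json()
print(f"retrieved {len(bundle.get('entry', []))} resources")
```

Once that Bundle leaves the EHR and lands on a consumer platform, everything in it is outside HIPAA.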

What nobody designed for: a single LLM-based health platform now needs to comply with six distinct regulatory frameworks at once. FTC deceptive practices rules, Illinois restrictions on AI in mental health contexts, California's medical information confidentiality law, Washington's My Health My Data Act, Texas genomic privacy law, and youth protection requirements when records signal a minor's age. Miss one and you're exposed in that jurisdiction.
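One way to keep that exposure visible is to treat framework applicability as data rather than tribal knowledge. The sketch below maps signals extracted from an incoming record to the frameworks named above; the trigger conditions are illustrative assumptions, not legal analysis, and the real applicability rules are considerably messier.

```python
from dataclasses import dataclass

@dataclass
class RecordSignals:
    state: str                  # patient's state of residence
    is_minor: bool              # record signals a minor's age
    has_genomic_data: bool
    mental_health_context: bool

def applicable_frameworks(s: RecordSignals) -> list[str]:
    # FTC deceptive practices authority applies regardless of jurisdiction.
    frameworks = ["FTC deceptive practices rules"]
    if s.state == "IL" and s.mental_health_context:
        frameworks.append("Illinois AI-in-mental-health restrictions")
    if s.state == "CA":
        frameworks.append("California medical information confidentiality law")
    if s.state == "WA":
        frameworks.append("Washington My Health My Data Act")
    if s.state == "TX" and s.has_genomic_data:
        frameworks.append("Texas genomic privacy law")
    if s.is_minor:
        frameworks.append("Youth protection requirements")
    return frameworks

# A minor in Illinois with therapy notes in the record triggers three at once.
print(applicable_frameworks(RecordSignals("IL", True, False, True)))
```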

There's also an auxiliary data problem: medical records describe parties who never consented to anything, from children listed as dependents, to providers whose notes and tax identifiers run through every page, to institutions whose proprietary clinical protocols are baked into the chart. Traditional consent frameworks assume a single data subject. An LLM ingesting a full record gets all of them at once.
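Consent flows can't fix this, but a pre-ingestion scrub at least keeps the most obvious third-party identifiers away from the model. Below is a deliberately simple, pattern-based sketch; the function name and redaction tokens are mine, and a production pipeline would layer NER and structured-field filtering on top of anything like this.

```python
import re

# Matches SSN (XXX-XX-XXXX) and EIN (XX-XXXXXXX) formats.
TIN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b|\b\d{2}-\d{7}\b")

def scrub_third_parties(text: str, dependent_names: list[str]) -> str:
    """Redact tax identifiers and named dependents before text reaches the model."""
    text = TIN_PATTERN.sub("[REDACTED-TIN]", text)
    for name in dependent_names:  # people named in the record who never consented
        text = re.sub(re.escape(name), "[REDACTED-DEPENDENT]", text, flags=re.IGNORECASE)
    return text

note = "Dependent: Jane Doe (DOB 2015). Guarantor EIN 12-3456789."
print(scrub_third_parties(note, ["Jane Doe"]))
# -> Dependent: [REDACTED-DEPENDENT] (DOB 2015). Guarantor EIN [REDACTED-TIN].
```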

Off-label prescribing is legal, common, and often evidence-based, yet an AI reviewer trained against approved labeling may flag it as an error. When it does, the liability questions it creates are the platform's to answer, not the prescriber's.

One practical rule: if your platform publicly promises HIPAA-level protections for non-covered data, the FTC will treat that promise as an enforceable commitment under its deceptive practices authority. Document what you're promising before the next product launch.

— Nathan Zakhary