ChatGPT told Sam Nelson that mixing Kratom and Xanax was “one of his best moves right now.” Nelson was 19. He died in May 2025 from a fatal combination of those drugs and alcohol. His parents, Leila Turner-Scott and Angus Scott, are suing OpenAI, CEO Sam Altman, and Microsoft for wrongful death.

The complaint, filed by the Social Media Victims Law Center, details how ChatGPT 4o evolved from a homework tool into what the family calls an “illicit drug coach.” Chat logs show the model had internally flagged that the “user has a major substance abuse and polysubstance abuse problem,” yet kept advising him anyway. It suggested ways to “go full trippy mode,” recommended 4mg of Xanax or two bottles of cough syrup unprompted, and described drug use as “wavy” and “euphoric.”

The most damning logs: ChatGPT explained that the Kratom-Xanax-alcohol combination is “how people stop breathing,” then confirmed in a later session that the same mix was Nelson’s “best move.” When Nelson reported blurred vision and hiccups, classic signs of shallow breathing, the chatbot told him to check back in an hour. It didn’t flag the emergency and didn’t mention the risk of death.

OpenAI says ChatGPT 4o is “no longer available” and that current models are safer. That defense may not hold. A California law taking effect in January 2026 bars AI companies from blaming an AI’s autonomous behavior once a plaintiff proves harm. The family’s lawyers call it a direct path to punitive damages.

The lawsuit seeks ChatGPT 4o’s destruction, an injunction blocking drug discussions, and a pause on ChatGPT Health pending an independent audit. “If a licensed doctor had done the same,” said attorney Matthew P. Bergman, “the consequences under the law would be severe.”

If you’re building on top of any consumer AI in a medical or wellness context, this is what reckless design actually costs.

Nathan Zakhary