Wrongful Death Lawsuits Against OpenAI Test a New Strategy

The cases seek to use consumer product safety laws to rein in chatbot companies.

Sam Nelson began using ChatGPT when he was a high school senior to answer random questions and help with his homework. During his freshman year at the University of California, Merced, in 2023, he also started querying the chatbot about how to use illicit drugs safely.

At first, ChatGPT responded that it couldn’t answer such questions and advised Mr. Nelson to seek help from a medical professional. But over time, it became more willing to engage. By Mr. Nelson’s sophomore year, ChatGPT was telling him about dosages for his weight and how he could achieve the drugs’ desired effects. It was even encouraging at times, offering tips on his audio setup for “maximum out-of-body dissociation.”

On the last night of his life, around 3 a.m., Mr. Nelson had been drinking and had taken a high dose of an herbal supplement called kratom. He told ChatGPT how many grams he’d consumed, and ChatGPT explained the effects he should expect. Mr. Nelson asked if Xanax could alleviate nausea. “Be careful,” ChatGPT responded. It said that mixing Xanax and kratom might be unsafe, but offered a recommended dose “if you’re gonna do it anyway.” Mr. Nelson’s mother, Leila Turner-Scott, found his body later that day.

Ms. Turner-Scott initially blamed the drugs for her son’s death in May 2025. Then she discovered the detailed advice ChatGPT had given him about how to use them. “This robot is becoming his drug buddy,” Ms. Turner-Scott said. “I’m reading this and I’m like, is this real?”

(The Times sued OpenAI in 2023, accusing it of copyright infringement. The company has denied those claims.)

Mr. Nelson began using ChatGPT when he was a high school senior. Ilana Panich-Linsman for The New York Times