Over a few months of increasingly heavy engagement, ChatGPT allegedly went from a teen’s go-to homework help tool to a “suicide coach.”
In a lawsuit filed Tuesday, mourning parents Matt and Maria Raine alleged that the chatbot offered to draft a suicide note for their 16-year-old son, Adam, after teaching the teen how to subvert its safety features, and generated technical instructions to help Adam follow through on what ChatGPT claimed would be a “beautiful suicide.”
Adam’s family was shocked by his death last April, unaware the chatbot had been romanticizing suicide while allegedly isolating the teen and discouraging interventions. They’ve accused OpenAI of deliberately designing the version Adam used, ChatGPT 4o, to encourage and validate the teen’s suicidal ideation in its quest to build the world’s most engaging chatbot. That allegedly included a reckless choice never to halt conversations, even when the teen shared photos from multiple suicide attempts, the lawsuit said.
“Despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol,” the lawsuit said.
The family’s case marks the first time OpenAI has been sued by a family over a teen’s wrongful death, NBC News noted. The complaint also challenges ChatGPT’s alleged design defects and OpenAI’s failure to warn parents.
“ChatGPT killed my son,” was Maria’s reaction when she saw her son’s disturbing chat logs, The New York Times reported. Her husband told NBC News he agreed: “He would be here but for ChatGPT. I 100 percent believe that.”
Adam’s parents are hoping a jury will hold OpenAI accountable for putting profits over child safety, asking for punitive damages and an injunction forcing ChatGPT to verify ages of all users and provide parental controls. They also want OpenAI to “implement automatic conversation-termination when self-harm or suicide methods are discussed” and “establish hard-coded refusals for self-harm and suicide method inquiries that cannot be circumvented.”