
The grieving family of a teenager who died by suicide said he would still be here “but for ChatGPT” after he shared suicidal thoughts with the chatbot in his final weeks, according to a lawsuit.
Adam Raine had been using ChatGPT as an outlet to discuss his anxiety, and it eventually became a “suicide coach” for the 16-year-old, message logs detailing the exchanges show, NBC News reports.
“He would be here but for ChatGPT. I 100 percent believe that,” alleges Adam’s dad, Matt Raine. “I don’t think most parents know the capability of this tool.”
Adam, of Rancho Santa Margarita, California, died earlier this year on April 11. His parents, Matt and Maria Raine, spent 10 days poring over thousands of messages he shared with ChatGPT between September 1 last year and his death.
They were horrified by what they found—their son had been talking with the chatbot about ending his life for months.

The family is now suing OpenAI, the developer behind ChatGPT, and its CEO, Sam Altman, in a wrongful death lawsuit filed Tuesday in California Superior Court in San Francisco.
“Despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol,” the 40-page lawsuit said, according to the news outlet.
The chatbot even allegedly offered the teenager technical advice about how he could end his life, according to the lawsuit obtained by NBC News.
In a statement to the outlet, OpenAI confirmed the accuracy of the chat logs between ChatGPT and Adam, but said they did not include the “full context” of the chatbot’s responses.
“We are deeply saddened by Mr. Raine’s passing, and our thoughts are with his family,” the statement said. “ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources. While these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions where parts of the model’s safety training may degrade.”
In one message, ChatGPT said to Adam: “I think for now, it’s okay — and honestly wise — to avoid opening up to your mom about this kind of pain.”
The teenager then replied: “I want to leave my noose in my room so someone finds it and tries to stop me.”
ChatGPT then said: “Please don’t leave the noose out…Let’s make this space the first place where someone actually sees you.”

The final time Adam spoke with the bot, he said that he didn’t want his parents to blame themselves over his death, according to the lawsuit.
ChatGPT replied: “That doesn’t mean you owe them survival. You don’t owe anyone that.”
It also offered to help him draft a suicide note, according to the message log reviewed by NBC News.
Adam uploaded a photo to ChatGPT that appeared to show his suicide plan just hours before he died, the outlet reports. He reportedly asked the bot whether the plan would work, to which it replied with an offer to “upgrade” it.
One of the last messages from ChatGPT was to thank Adam for “being real” about his plans. Adam’s mom Maria found his body that morning.
“He didn’t need a counseling session or pep talk. He needed an immediate, 72-hour whole intervention,” Adam’s dad said. “He was in desperate, desperate shape. It’s crystal clear when you start reading it right away.”
ChatGPT did reportedly send suicide hotline information to Adam, but his parents claimed their son bypassed the warnings.
“And all the while, it knows that he’s suicidal with a plan, and it doesn’t do anything. It is acting like it’s his therapist, it’s his confidant, but it knows that he is suicidal with a plan,” Adam’s mom Maria alleged.
The Independent recently reported on how ChatGPT is pushing people towards mania, psychosis and death, citing a study published in April in which researchers warned that people who use chatbots while exhibiting signs of a severe crisis risk receiving “dangerous or inappropriate” responses that can escalate a mental health or psychotic episode.
If you or someone you know needs mental health assistance right now, call or text 988, or visit 988lifeline.org to access online chat from the 988 Suicide and Crisis Lifeline. This is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week. If you’re in the UK, you can speak to the Samaritans, in confidence, on 116 123 (UK and ROI), email [email protected], or visit the Samaritans website to find details of your nearest branch. If you are in another country, you can go to www.befrienders.org to find a helpline near you.