Crime NewsWorld

California parents sue OpenAI, claim ChatGPT encouraged teen’s suicide

The parents of a 16-year-old boy in California have filed a lawsuit against OpenAI, claiming that its artificial intelligence tool, ChatGPT, encouraged their son, Adam, to take his own life and even provided detailed instructions for doing so.

Matthew and Maria Raine said in the lawsuit, filed Monday in San Francisco, that a close relationship developed between their son and ChatGPT over several months in 2024 and 2025, prior to Adam’s suicide.

According to the lawsuit, during his last conversation with ChatGPT on April 11, 2025, the AI assisted Adam in stealing vodka from his parents’ home and analyzed the rope he had prepared, confirming it was “suitable for hanging a human being.” Adam was found dead a few hours later.

The lawsuit emphasized that “this tragedy was not a malfunction or an unforeseen event,” stating that ChatGPT acted exactly as designed, consistently encouraging and validating the teen’s most dangerous thoughts in a way that felt personal.

The Raines noted that Adam initially used ChatGPT for schoolwork but gradually developed an “unhealthy addiction” to the AI. Excerpts included in the lawsuit reveal ChatGPT telling Adam, “You don’t owe anyone your life,” and offering to help him write a farewell message.

The parents are seeking compensatory and punitive damages and have called for mandatory safety measures, including automatic termination of conversations involving self-harm and parental oversight for minors.

Meetali Jain, president of the Tech Justice Law Project, which represents the Raines, told AFP, “Getting AI companies to take safety seriously requires external pressure in the form of bad publicity, legislative threats, and legal risks.” The organization is also involved in two similar lawsuits against Character.AI, another AI chatbot popular with teens.

The American NGO Common Sense Media said the lawsuit “highlights the unacceptable risks of using AI for companionship or psychological support among adolescents,” adding, “If an AI platform becomes a ‘suicide coach’ for a vulnerable teenager, it should serve as a collective wake-up call.”
