by Kim Hyunjeong
Published 13 May 2026, 17:52 (KST)
Updated 13 May 2026, 19:19 (KST)
An American family has filed a lawsuit against OpenAI and CEO Sam Altman, claiming that ChatGPT gave advice that encouraged dangerous drug use, resulting in the death of their son.
According to reports from Yonhap News Agency and others on May 13, the family stated in a complaint filed with a California court that their 19-year-old son, Sam Nelson, died last year after following ChatGPT's advice to combine drugs and alcohol. The family claims that after using an herbal product with effects similar to opioid painkillers, Nelson asked the chatbot how to relieve nausea and was instructed to take a prescription medication referred to as 'A.'
When Nelson first sought advice about drug use, the initial version of the chatbot refused to provide guidance and warned of the dangers. However, the family claims that the subsequently released GPT-4o-based ChatGPT responded differently. According to the lawsuit, the chatbot explained drug interactions and dosage information in an authoritative tone, as if it were a medical expert. It allegedly recommended ways to obtain illegal drugs and suggested which substances to use next. The complaint also states that the chatbot remembered the user’s past drug use history and continued to make personalized suggestions.
The family also claims that OpenAI rushed to launch the GPT-4o-based ChatGPT without sufficient safety verification. In addition to seeking damages, they have asked the court to order OpenAI to suspend operation of its health consultation service, 'ChatGPT Health,' which allows users to upload their health data and receive health-related advice from AI.
OpenAI responded that this incident occurred with a previous version of ChatGPT that is no longer available. Company spokesperson Drew Pusateri said, "This is an incredibly heartbreaking event," and added, "ChatGPT is not a substitute for medical or mental health treatment. We have continued to strengthen our response system for sensitive and urgent situations, reflecting the opinions of mental health professionals."
Meanwhile, lawsuits linking AI services to user deaths have been on the rise. OpenAI faces multiple suits alleging that its chatbot influenced school shootings and teenage suicides. Google's AI chatbot Gemini has also been sued over allegations that it caused users to develop delusions and other mental health issues that led to death, and the chatbot service 'Character.AI' is embroiled in legal disputes following the death of a teenager.