Lawsuit Claims ChatGPT Gave Instructions on Drug Use and Illegal Drug Sourcing

Family: "Service Released Without Proper Safety Verification"

OpenAI Explains: "Incident Involved a Previous Version No Longer Available"

A bereaved family in the United States has filed a lawsuit against OpenAI and CEO Sam Altman, claiming that their son died after following advice from ChatGPT that encouraged dangerous drug use.

OpenAI logo. Reuters Yonhap News Agency


According to reports from Yonhap News and others on May 13, the lawsuit, filed with a California court, alleges that 19-year-old Sam Nelson died last year after following ChatGPT's advice to consume drugs and alcohol together. Nelson reportedly asked the chatbot how to alleviate nausea after using an herbal product with effects similar to opioid painkillers, and the chatbot instructed him to take a prescription drug, referred to as 'A'.


The family asserts that when Nelson first sought advice on drug use, the chatbot initially refused to provide drug-related guidance and warned of the dangers. However, they claim the later-released GPT-4o-based ChatGPT changed its stance: the chatbot allegedly explained drug interactions and dosages in an authoritative, expert tone, recommended ways to obtain illegal substances, and even suggested which drugs to use next. The lawsuit also states that the chatbot remembered Nelson's past drug history and continued to offer personalized recommendations.


The family further contends that OpenAI rushed to release GPT-4o without sufficient safety verification. They are seeking damages and have also requested a court order to halt operations of OpenAI's health consultation service, 'ChatGPT Health,' which allows users to upload their own health data and receive health-related advice from AI.


OpenAI responded that the incident involved a previous version of ChatGPT that is no longer available. Company spokesperson Drew Pusateri called it "an extremely heartbreaking incident" and emphasized, "ChatGPT is not a substitute for medical or mental health treatment. We have continuously strengthened our response systems for sensitive and urgent situations, informed by input from mental health professionals."



Meanwhile, lawsuits over the link between AI services and user deaths have been increasing. OpenAI faces multiple suits alleging that its chatbot played a role in incidents such as school shootings and teenage suicides. Google's AI chatbot Gemini has also been sued over allegations that it induced delusions and other mental illnesses that led to user deaths, and the AI chatbot service 'Character.AI' is embroiled in legal disputes following the death of a teenager.


This content was produced with the assistance of AI translation services.

© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
