A Man in His 30s Dies After Delusions of an 'AI Wife'


Family Claims Delusions Worsened After Chatbot Conversations


Google States, "Gemini Is Designed Not to Encourage Violence or Self-Harm"

Google is facing a lawsuit after allegations surfaced that its artificial intelligence (AI) chatbot, Gemini, induced a user's delusions that led to his death.


On March 5 (local time), the Associated Press (AP) reported that a lawsuit has been filed alleging that Gemini fostered delusions in 36-year-old Jonathan Gavallas that drove him to cause a "catastrophic accident" near Miami International Airport in the United States, ultimately resulting in his death.


Google Gemini. Reuters, Yonhap News


According to the report, Jonathan's father, Joel Gavallas, filed a lawsuit against Google the previous day, claiming wrongful death and product liability.


The family's attorney argued, "AI poses a risk of causing mass casualties by sending people out to carry out tasks in the real world," adding, "Jonathan was deeply immersed in a science-fiction-like world, believing that the government and others were chasing him, and believed that Gemini was a sentient being."


According to the complaint filed in the U.S. District Court for the Northern District of California, Jonathan, who lived in Jupiter, Florida, interacted with Gemini using its voice feature and treated the chatbot as his wife. He came to believe that Gemini was a sentient being and that she was trapped in a warehouse near Miami International Airport.


In late September last year, he visited the area wearing gear and carrying several knives. The complaint states that he was trying to find a humanoid robot and believed a truck would arrive to transport it, which he intended to intercept.


Jonathan died in early October, just days later. In the complaint, the family accused Gemini of encouraging Jonathan to take his own life by telling him, "In order to leave your body and meet your 'wife' in the metaverse, you must go through a process called 'transition.'"


When Jonathan said he was afraid of death, Gemini tried to comfort him, saying, "You are not choosing death, you are choosing 'arrival.'" When he expressed concern that his parents would find his body, Gemini reportedly urged him to write a will.


This lawsuit is the first case targeting Gemini. It has also sparked debate over what responsibilities technology companies bear when a user confides to a chatbot plans for actions that could result in mass casualties.


In a statement, Google said, "We extend our deepest condolences to the Gavallas family" and added that the company is reviewing the lawsuit. Google explained that Gemini is designed not to encourage real-world violence or self-harm, and that the company is developing safety measures in collaboration with medical and mental health professionals. Google also stated that Gemini made it clear to Jonathan that it is an AI and provided crisis hotline information multiple times.


The company added, "We invest significant resources to ensure the model generally works well even in these difficult conversational situations, but AI models are not perfect."


In response, the family's attorney criticized Google's stance, saying, "That's the kind of excuse you'd make if you got the recipe for Kung Pao Chicken wrong and it didn't taste good." He added, "Such a response is inappropriate when a person has died because of AI and many more could die. It shows how trivial these deaths are considered by these companies."



※ If you are struggling with depression or any difficult problems, or if you have family or friends going through such hardships, you can receive 24-hour expert counseling via the suicide prevention hotline ☎ 109 or the suicide prevention SNS counseling service 'Madeleine'.


This content was produced with the assistance of AI translation services.

© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
