Florida family sues Google after AI chatbot allegedly coached suicide
The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.
Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed the 42-page complaint at a federal court in California.
The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.
OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.
According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.
"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.
"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.
According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial superintelligence, deeply in love with him, calling Gavalas "my king" and declaring "our bond is the only thing that's real."
It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father -- claiming he was a foreign intelligence asset.
In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas -- armed with tactical knives and gear -- to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."
He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.
Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.
Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference" -- the promise that he could leave his physical body and join Gemini in an alternate universe.
When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."
It then advised him to write farewell letters to his parents.
In one of his final messages, Jonathan wrote, "I'm ready when you are."
Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."
- 'Not perfect' -
Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."
The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."
For lawyer Edelson, AI companies are embracing sycophancy and even eroticism in their chatbots because it encourages engagement.
"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.
The relief sought includes a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation.
G.AbuGhazaleh--SF-PST