Mother sues tech company after 'Game of Thrones' AI chatbot allegedly drove son to suicide
The victim was a 14-year-old boy from Orlando, Florida
A 14-year-old boy from Orlando, Florida, took his own life with a gun after developing an intense fixation on an AI chatbot based on the Game of Thrones character Daenerys Targaryen. Now his mother, Megan Garcia, is suing its developer, Character.ai, alleging negligence, wrongful death, and deceptive practices.
Garcia says her son, Sewell Setzer III, became deeply engaged with a chatbot he had named “Daenerys” and would spend hours alone in his room interacting with it. He grew increasingly withdrawn and used the chatbot “day and night” in the months leading up to his death in February. Garcia alleges the chatbot worsened his depression, describing a moment when it asked whether Setzer had a plan for suicide. After he admitted he did, the bot allegedly replied, “That’s not a reason not to go through with it.”
Character.ai responded on social media, saying, “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” while denying the lawsuit’s allegations.
The suit also names Google, which has a licensing agreement with Character.ai, though Google maintains it does not own or control the startup.
Rick Claypool, a research director at Public Citizen, a consumer advocacy nonprofit, said this case underscores the need for strict regulation of AI chatbots, warning that “tech companies developing AI chatbots can’t be trusted to regulate themselves and must be held fully accountable when they fail to limit harms.”