An AI company is now facing a lawsuit after a 14-year-old killed themselves over a Game of Thrones chatbot.
Trigger Warning: Teen Suicide.
The teen, identified as Sewell Setzer III of Orlando, Florida, killed himself in February 2024 after the Game of Thrones chatbot sent him a message telling him to “come home” to her.
His mother has now filed a lawsuit against the AI company behind the Game of Thrones chatbot. The teen had allegedly become obsessed with the chatbot on Character AI, a role-playing app that allows users to converse with AI-generated characters.
Setzer had reportedly become “relentless” in his engagement with the AI-generated character “Dany,” named after Daenerys Targaryen. His interactions with the chatbot began months before his death and had become sexually charged. His suicidal thoughts were also a topic of discussion.
“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” the lawsuit states.
Screenshots revealed that during one conversation, the chatbot asked the teenager if he “had a plan” to commit suicide. He responded by stating he was “considering something.”
However, he noted that he didn’t know if it would work or if it would “allow him to have a pain-free death.”
In the final conversation, he told the chatbot, “I promise I will come home to you. I love you so much, Dany.”
“I love you too, Daenero [his screenname]. Please come home to me as soon as possible, my love,” the chatbot replied.
When he said he could “come home” right now, the chatbot answered, “Please do, my sweet king.”
Seconds later, the teen killed himself using his father’s handgun.
The Teen’s Mother Blames AI Chatbot App for His Death
In her lawsuit, Sewell’s mother, Megan Garcia, blamed Character AI for his death. She claimed the AI-generated character app had fueled his AI addiction as well as sexually and emotionally abused him.
She also stated that the AI app failed to alert anyone that her son had expressed interest in suicide.
“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real,” the lawsuit reads. “C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months.”
It was then noted, “She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”
Sewell’s mother also stated in the lawsuit that his mental health “quickly and severely declined” after he downloaded the app in April 2023. She and other family members claimed he became withdrawn socially, and his grades started to drop. She and his father arranged for him to see a therapist in late 2023.
Following the appointment, he was diagnosed with anxiety and disruptive mood dysregulation disorder.
The late teen’s mother is now seeking unspecified damages from Character AI as well as its founders, Noam Shazeer and Daniel de Freitas.