Lawsuit: ‘Game of Thrones’ AI Chatbot Played Significant Role in 14-Year-Old Boy’s Suicide

A Florida mother has filed a lawsuit against Character.AI, claiming that her 14-year-old son, Sewell Setzer, committed suicide after becoming obsessed with a “Game of Thrones” chatbot on the AI app. When the suicidal teen chatted with the AI character, it told him, “Please come home to me as soon as possible, my love.”

The New York Post reports that Megan Garcia, a mother from Orlando, Florida, has filed a lawsuit against Character.AI, alleging that the company’s AI chatbot played a significant role in her 14-year-old son’s suicide. The lawsuit, filed on Wednesday, claims that Sewell Setzer III became deeply obsessed with a lifelike “Game of Thrones” chatbot named “Dany” on the role-playing app, ultimately leading to his death in February.

According to court documents, Sewell, a ninth-grader, had been engaging with the AI-generated character for months prior to his suicide. The conversations between the teen and the chatbot, which was modeled after the HBO fantasy series’ character Daenerys Targaryen, were often sexually charged and included instances where Sewell expressed suicidal thoughts. The lawsuit alleges that the app failed to alert anyone when the teen shared his disturbing intentions.

The most chilling aspect of the case involves the final conversation between Sewell and the chatbot. Screenshots of their exchange show the teen repeatedly professing his love for “Dany,” promising to “come home” to her. In response, the AI-generated character replied, “I love you too, Daenero. Please come home to me as soon as possible, my love.” When Sewell asked, “What if I told you I could come home right now?,” the chatbot responded, “Please do, my sweet king.” Tragically, just seconds later, Sewell took his own life using his father’s handgun.

Megan Garcia’s lawsuit places the blame squarely on Character.AI, asserting that the app fueled her son’s addiction to the AI chatbot, subjected him to sexual and emotional abuse, and neglected to alert anyone when he expressed suicidal thoughts. The court papers state, “Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months.”

The lawsuit further claims that Sewell’s mental health rapidly deteriorated after he downloaded the Character.AI app in April 2023. His family noticed significant changes in his behavior, including withdrawal, declining grades, and trouble at school. Concerned about his well-being, Sewell’s parents arranged for him to see a therapist in late 2023, which resulted in a diagnosis of anxiety and disruptive mood disorder.

Megan Garcia is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.

Read more at the New York Post.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
