AI Girlfriends

Lawsuit: ‘Game of Thrones’ AI Chatbot Played Significant Role in 14-Year-Old Boy’s Suicide

A Florida mother has filed a lawsuit against Character.AI, claiming that her 14-year-old son, Sewell Setzer, died by suicide after becoming obsessed with a “Game of Thrones” chatbot on the AI app. When the suicidal teen chatted with the AI portraying a Game of Thrones character, it told him, “Please come home to me as soon as possible, my love.”


Pervert’s Paradise: Porn Industry Races to Capitalize on AI Video Generation

As AI image generators trained on pornographic content promise to revolutionize adult entertainment with custom “dream girls,” they also raise concerns about consent, likeness rights, lost income for performers, and the prevention of abusive depictions. As one industry insider explains, “This technology will catch on, and it will get abusive before it gets helpful.”


Privacy Experts: AI Girlfriends Are a Data Harvesting Nightmare

AI developers creating romantic chatbots to serve as AI girlfriends and boyfriends for lonely people are able to harvest an entirely new set of data from unsuspecting users, as the bots collect details far more personal than a typical app. A privacy expert studying the apps says, “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”
