San Francisco Sues AI ‘Undressing’ Websites Used to Generate Deepfake Nudes


The San Francisco City Attorney’s office has launched a lawsuit against 16 of the most visited AI-powered “undressing” websites, alleging violations of state and federal laws prohibiting revenge porn, deepfake pornography, and child pornography.

The Verge reports that in a move to combat the rising tide of AI-generated non-consensual nude images, San Francisco City Attorney David Chiu has announced a groundbreaking lawsuit targeting 16 of the most frequently visited AI “undressing” websites. These sites, which use artificial intelligence to create fake nude images of real people without their consent, have collectively amassed over 200 million visits in just the first half of 2024.

The lawsuit, filed by the San Francisco City Attorney’s office, accuses the website operators of violating multiple state and federal laws, including those prohibiting revenge pornography, deepfake pornography, and child pornography. Additionally, the complaint alleges that these sites are in breach of California’s unfair competition law, arguing that the harm caused to consumers far outweighs any potential benefits associated with their practices.

The targeted websites employ sophisticated AI tools that allow users to upload images of fully clothed individuals, which are then digitally manipulated to simulate nudity. One unnamed website even boasted in its marketing material about circumventing the need for dating, suggesting users could simply generate nude images of their romantic interests instead.

This legal action comes at a time of increasing concern over the proliferation of deepfake technology and its potential for misuse. Recent incidents in which celebrities such as Taylor Swift were targeted with sexually explicit deepfakes have brought the issue into the public spotlight. Moreover, there have been alarming reports of schoolchildren facing expulsion or arrest for circulating AI-generated nude photos of their classmates.

The lawsuit seeks not only civil penalties but also the permanent shutdown of these websites, along with an order barring their operators from creating deepfake pornography in the future. City Attorney Chiu expressed his horror at the exploitation enabled by these technologies, describing the investigation as leading to “the darkest corners of the internet.”

The legal challenge posed by San Francisco highlights the complex intersection of technology, privacy, and consent in the digital age. As AI technologies continue to advance, the ability to create highly convincing fake imagery has raised significant ethical and legal questions. This lawsuit represents one of the most comprehensive legal efforts to date to address the issue of non-consensual deepfake pornography.

Experts in digital rights and online safety have long warned about the potential for AI to be used in creating and spreading non-consensual intimate imagery. The ease with which these “undressing” websites operate has led to a surge in reports of “sextortion” and other forms of digital harassment. The lawsuit aims to set a precedent in holding the creators and operators of such technologies accountable for their impact on individuals and society.

Read more at The Verge here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
