Ashley Belanger - 8/16/2024

San Francisco's city attorney David Chiu is suing to shut down 16 of the most popular websites and apps allowing users to "nudify" or "undress" photos of mostly women and girls who have been increasingly harassed and exploited by bad actors online.
These sites, Chiu's suit claimed, are "intentionally" designed to "create fake, nude images of women and girls without their consent," boasting that any user can upload any photo to “see anyone naked” by using tech that realistically swaps the faces of real victims onto AI-generated explicit images.
"In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated" non-consensual intimate imagery (NCII) and "this distressing trend shows no sign of abating," Chiu's suit said.
"Given the widespread availability and popularity" of nudify websites, "San Franciscans and Californians face the threat that they or their loved ones may be victimized in this manner," Chiu's suit warned.
In a press conference, Chiu said that this "first-of-its-kind lawsuit" has been raised to defend not just Californians, but "a shocking number of women and girls across the globe"—from celebrities like Taylor Swift to middle and high school girls. Should the city official win, each nudify site risks fines of $2,500 for each violation of California's consumer protection law.
On top of media reports sounding alarms about the AI-generated harm, law enforcement has joined the call to ban so-called deepfakes.
Chiu said the harmful deepfakes are often created "by exploiting open-source AI image generation models," such as earlier versions of Stable Diffusion, that can be honed or "fine-tuned" to easily "undress" photos of women and girls that are frequently yanked from social media. While later versions of Stable Diffusion make such "disturbing" forms of misuse much harder, San Francisco city officials noted at the press conference that fine-tunable earlier versions of Stable Diffusion are still widely available to be abused by bad actors.
In the US alone, cops are currently so bogged down by reports of fake AI child sex images that it's making it hard to investigate child abuse cases offline, and these AI cases are expected to continue spiking "exponentially." The AI abuse has spread so widely that "the FBI has warned of an uptick in extortion schemes using AI generated non-consensual pornography," Chiu said at the press conference. "And the impact on victims has been devastating," harming "their reputations and their mental health," causing "loss of autonomy," and "in some instances causing individuals to become suicidal."
Suing on behalf of the people of the state of California, Chiu is seeking an injunction requiring nudify site owners to cease operation of "all websites they own or operate that are capable of creating AI-generated" non-consensual intimate imagery of identifiable individuals. It's the only way, Chiu said, to hold these sites "accountable for creating and distributing AI-generated NCII of women and girls and for aiding and abetting others in perpetrating this conduct."
He also wants an order requiring "any domain-name registrars, domain-name registries, webhosts, payment processors, or companies providing user authentication and authorization services or interfaces" to "restrain" nudify site operators from launching new sites to prevent any further misconduct.
Chiu's suit redacts the names of the most harmful sites his investigation uncovered but claims that in the first six months of 2024, the sites "have been visited over 200 million times."
While victims typically have little legal recourse, Chiu believes that state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as California's unfair competition law, can be wielded to take down all 16 sites. Chiu expects that a win will serve as a warning to other nudify site operators that more takedowns are likely coming.
"We are bringing this lawsuit to get these websites shut down, but we also want to sound the alarm," Chiu said at the press conference. "Generative AI has enormous promise, but as with all new technologies, there are unanticipated consequences and criminals seeking to exploit them. We must be clear that this is not innovation. This is sexual abuse."
Harmful AI deepfakes often made in seconds for free
According to Chiu, sites targeted by the suit promote harm by advertising that users can "nudify anyone in seconds” or generate "deepnude [girls] for free." One site visited "over 12 million times" so far in 2024 even offers "step-by-step instructions on how to select images that will provide 'good' quality nudified results."
"Imagine wasting time taking her out on dates, when you can just use" a deepfake site "to get her nudes," advertised one site whose operator is currently unknown.
Chiu hopes the lawsuit, in addition to enjoining companies allegedly aiding and abetting the production and distribution of deepfakes, will help identify anonymous site operators that city officials said are currently "hidden in the shadows."
"This investigation has taken our office into the darkest corners of the Internet," Chiu said at the press conference. "All of us have been absolutely horrified for the women and girls who have had to endure this exploitation."
The harmful AI outputs can be very realistic, Chiu warned, echoing one parent of a young girl victimized in Spain who told The Guardian that "if I didn’t know my daughter’s body, I would have thought that image was real." And some sites fail "to deploy available technology to detect images of minors," Chiu's complaint said.
Further, it can be impossible for victims to trace the origins of deepfakes, never knowing which site is responsible for aiding and abetting bad actors.
The US Department of Justice is just as horrified as San Francisco city officials, but so far, there has been little accountability for those creating or distributing harmful outputs. Earlier this year, the DOJ declared that child sexual abuse materials (CSAM) "generated by AI is still CSAM" after a rare arrest, suggesting that more arrests could be on the horizon.
These nudify sites are very easy to find and are consistently listed atop Google and Bing search results. Recently, Google has taken steps to help reduce the visibility of harmful AI outputs by burying results for searches of deepfakes using specific individuals' names, but Google won't downrank popular deepfake sites until enough victims report harms, the search giant said last month. As of this writing, these sites can still be easily surfaced on Google.
Chiu's complaint noted that once a user finds one of these nudify sites, they can use Apple, Google, Discord, or X accounts to easily log in. Several sites targeted by Chiu's suit offer a limited amount of nudify images to be created for free, after which users can pay to create as many fake images as they want, using credit cards, Apple Pay, PayPal, Venmo, Cash App, or cryptocurrency, his complaint said.
Some of these sites claim to "require users to obtain consent" from victims to create fake explicit images, but Chiu's investigation unsurprisingly found that stipulation is not enforced. Some of the most popular sites allow uploads of "any image that a user wants to nudify" without verifying "that the depicted individual has consented to the nudification of her image."
While other lawmakers are working to get state and federal laws on the books banning explicit AI images—and even criminalizing deepfakes in the US and UK—Chiu's suit is notable because it attempts to invoke existing laws to widely target popular sites for takedowns in an attempt to get ahead of abuse.
It's unclear if the court will agree that AI-generated "intimate body parts" violate laws prohibiting revenge porn or CSAM, but Chiu's complaint also cited US code specifically aimed at banning visual depictions of any kind—including a drawing, cartoon, sculpture, or painting—that depicts a minor engaging in sexually explicit conduct. The city attorney also argued that California consumer protection law alone could be enough to shut down sites.
"Defendants' acts and practices of creating nudified images constitute unfair business practices because they offend established public policy, the harm they cause to consumers greatly outweighs any benefits associated with those practices, and they are immoral, unethical, oppressive, unscrupulous and/or substantially injurious to consumers," Chiu's complaint said.