Popular AI “nudify” sites sued amid shocking rise in victims globally
“Nudify” sites may be fined for making it easy to “see anyone naked,” suit says.


Ashley Belanger - 8/16/2024


San Francisco's city attorney David Chiu is suing to shut down 16 of the most popular websites and apps allowing users to "nudify" or "undress" photos of mostly women and girls who have been increasingly harassed and exploited by bad actors online.

These sites, Chiu's suit claimed, are "intentionally" designed to "create fake, nude images of women and girls without their consent," boasting that users can upload any photo to “see anyone naked” by using tech that realistically swaps the faces of real victims onto AI-generated explicit images.

"In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated" non-consensual intimate imagery (NCII) and "this distressing trend shows no sign of abating," Chiu's suit said.

"Given the widespread availability and popularity" of nudify websites, "San Franciscans and Californians face the threat that they or their loved ones may be victimized in this manner," Chiu's suit warned.

In a press conference, Chiu said that this "first-of-its-kind lawsuit" has been raised to defend not just Californians, but "a shocking number of women and girls across the globe"—from celebrities like Taylor Swift to middle and high school girls. Should the city official win, each nudify site risks fines of $2,500 for each violation of California consumer protection law.

On top of media reports sounding alarms about the AI-generated harm, law enforcement has joined the call to ban so-called deepfakes.

Chiu said the harmful deepfakes are often created "by exploiting open-source AI image generation models," such as earlier versions of Stable Diffusion, that can be honed or "fine-tuned" to easily "undress" photos of women and girls that are frequently yanked from social media. While later versions of Stable Diffusion make such "disturbing" forms of misuse much harder, San Francisco city officials noted at the press conference that fine-tunable earlier versions of Stable Diffusion are still widely available to be abused by bad actors.

In the US alone, cops are currently so bogged down by reports of fake AI child sex images that it has become hard to investigate child abuse cases offline, and these AI cases are expected to continue spiking "exponentially." The AI abuse has spread so widely that "the FBI has warned of an uptick in extortion schemes using AI-generated non-consensual pornography," Chiu said at the press conference. "And the impact on victims has been devastating," harming "their reputations and their mental health," causing "loss of autonomy," and "in some instances causing individuals to become suicidal."

Suing on behalf of the people of the state of California, Chiu is seeking an injunction requiring nudify site owners to cease operation of "all websites they own or operate that are capable of creating AI-generated" non-consensual intimate imagery of identifiable individuals. It's the only way, Chiu said, to hold these sites "accountable for creating and distributing AI-generated NCII of women and girls and for aiding and abetting others in perpetrating this conduct."

He also wants an order requiring "any domain-name registrars, domain-name registries, webhosts, payment processors, or companies providing user authentication and authorization services or interfaces" to "restrain" nudify site operators from launching new sites to prevent any further misconduct.

Chiu's suit redacts the names of the most harmful sites his investigation uncovered but claims that in the first six months of 2024, the sites "have been visited over 200 million times."

While victims typically have little legal recourse, Chiu believes that state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as California's unfair competition law, can be wielded to take down all 16 sites. Chiu expects that a win will serve as a warning to other nudify site operators that more takedowns are likely coming.

"We are bringing this lawsuit to get these websites shut down, but we also want to sound the alarm," Chiu said at the press conference. "Generative AI has enormous promise, but as with all new technologies, there are unanticipated consequences and criminals seeking to exploit them. We must be clear that this is not innovation. This is sexual abuse."

Harmful AI deepfakes often made in seconds for free

According to Chiu, sites targeted by the suit promote harm by advertising that users can "nudify anyone in seconds" or generate "deepnude [girls] for free." One site visited "over 12 million times" so far in 2024 even offers "step-by-step instructions on how to select images that will provide 'good' quality nudified results."

"Imagine wasting time taking her out on dates, when you can just use" a deepfake site "to get her nudes," advertised one site whose operator is currently unknown.

Chiu hopes the lawsuit, in addition to enjoining companies allegedly aiding and abetting the production and distribution of deepfakes, will help identify anonymous site operators that city officials said are currently "hidden in the shadows."

"This investigation has taken our office into the darkest corners of the Internet," Chiu said at the press conference. "All of us have been absolutely horrified for the women and girls who have had to endure this exploitation."

The harmful AI outputs can be very realistic, Chiu warned, echoing one parent of a young girl victimized in Spain who told The Guardian that "if I didn’t know my daughter’s body, I would have thought that image was real." And some sites fail "to deploy available technology to detect images of minors," Chiu's complaint said.

Further, it can be impossible for victims to trace the origins of deepfakes, never knowing which site is responsible for aiding and abetting bad actors.

The US Department of Justice is just as horrified as San Francisco city officials, but so far, there has been little accountability for those creating or distributing harmful outputs. Earlier this year, the DOJ declared that child sexual abuse materials (CSAM) "generated by AI is still CSAM" after a rare arrest, suggesting that more arrests could be on the horizon.

These nudify sites are very easy to find and are consistently listed atop Google and Bing search results. Recently, Google has taken steps to help reduce the visibility of harmful AI outputs by burying results for searches of deepfakes using specific individuals' names, but Google won't downrank popular deepfake sites until enough victims report harms, the search giant said last month. As of this writing, these sites can still be easily surfaced on Google.

Chiu's complaint noted that once a user finds one of these nudify sites, they can use Apple, Google, Discord, or X accounts to easily log in. Several sites targeted by Chiu's suit offer a limited amount of nudify images to be created for free, after which users can pay to create as many fake images as they want, using credit cards, Apple Pay, PayPal, Venmo, Cash App, or cryptocurrency, his complaint said.

Some of these sites claim to "require users to obtain consent" from victims to create fake explicit images, but Chiu's investigation unsurprisingly found that stipulation is not enforced. Some of the most popular sites allow uploads of "any image that a user wants to nudify" without verifying "that the depicted individual has consented to the nudification of her image."

While other lawmakers are working to get state and federal laws on the books banning explicit AI images—and even criminalizing deepfakes in the US and UK—Chiu's suit is notable because it attempts to invoke existing laws to widely target popular sites for takedowns in an attempt to get ahead of abuse.

It's unclear if the court will agree that AI-generated "intimate body parts" violate laws prohibiting revenge porn or CSAM, but Chiu's complaint also cited US code specifically aimed at banning visual depictions of any kind—including a drawing, cartoon, sculpture, or painting—that depicts a minor engaging in sexually explicit conduct. The city attorney also argued that California consumer protection law alone could be enough to shut down sites.

"Defendants' acts and practices of creating nudified images constitute unfair business practices because they offend established public policy, the harm they cause to consumers greatly outweighs any benefits associated with those practices, and they are immoral, unethical, oppressive, unscrupulous and/or substantially injurious to consumers," Chiu's complaint said.
 


> Under that logic, simple possession of CSEM/CSAM should not be prosecuted, as the harm has already been done in the past and the act of downloading it is not itself (directly) creating further harm.
Perpetuating demand for CSAM is bad because producing the supply innately necessitates further harm. You could make a nude-editing model today with all the 50-state-legal porn around that will generate images forever without any need for more legal porn, let alone more physical harm.

I think it's cringe to use a model to make nude edits of candid photographs, and it already is a crime to try passing off these fakes as the real McCoy, but I'm not supporting outlawing it. Women have already settled the matter that their bodies are commodities with the glorification of sex work and surrogacy. There is no decency to protect in vanilla nude stills being made when it's empowering to buy a house selling foot pictures and queef videos.
 
Imagine a fucking AI actively monitoring your drawings and stopping you or phoning the police if it sees you start to draw Taylor Swift. That's the future they want.
THIS IS THE POLICE, PUT THE STYLUS DOWN AND STEP AWAY FROM THE SURFACE PRO. DO NOT CONTINUE TO DRAW TAYTAY FUCKING TELLY MONSTER OR WE WILL OPEN FIRE.
 
> 5 minutes in Photoshop or GIMP can achieve the same thing as these "AI services." Are they gonna sue those too? No? Just these? OK lmao.
The point is that this AI shit streamlines the process and basically gives any low-IQ pedophile the ability to create blackmail CP content en masse, whereas Photoshop and GIMP rely on some amount of skill and time, so there's a barrier to entry.

Literally, before AI became a big thing, Photoshop was seen as an actual (yet niche) skill where you'd pay experts to photoshop an image for a specific purpose, but now that's gone the way of the dodo.

Though tbh, you'd think with all this data selling and invasion of privacy these AI companies are doing, they would at least have a 'hard block' list on images that look too young? :roll:
 
The whole purpose of those apps is to undress other people; meanwhile, image editing software, while it has the capability of doing "the same," doesn't have that as its main selling point. It's like how a movie with a sex scene is not the same as a porno.

The PR line is always "think of the women and children" and then suddenly only a select few control AI. 🤷‍♂️ 🤷‍♂️

same ol' same ol'
 
> The PR line is always "think of the women and children" and then suddenly only a select few control AI. 🤷‍♂️ 🤷‍♂️
>
> same ol' same ol'
Yeah, but there's actual women and children here. And it also isn't as if they haven't gutted most AI tools already
 
> Yeah, but there's actual women and children here. And it also isn't as if they haven't gutted most AI tools already

And they'll continue to gut them in the name of safety, until they remove even the AI tools that check whether anything they show you is fake or not.

 
> The point is that this AI shit streamlines the process and basically gives any low-IQ pedophile the ability to create blackmail CP content en masse, whereas Photoshop and GIMP rely on some amount of skill and time, so there's a barrier to entry.
>
> Literally, before AI became a big thing, Photoshop was seen as an actual (yet niche) skill where you'd pay experts to photoshop an image for a specific purpose, but now that's gone the way of the dodo.
>
> Though tbh, you'd think with all this data selling and invasion of privacy these AI companies are doing, they would at least have a 'hard block' list on images that look too young? :roll:
How would you create blackmail CP when the person being blackmailed could just notify the cops that the blackmailer doesn't have real nude images of them but is using AI shit to blackmail them?
 
> How would you create blackmail CP when the person being blackmailed could just notify the cops that the blackmailer doesn't have real nude images of them but is using AI shit to blackmail them?
AI is getting better at looking more and more realistic, that's why. Even if it's obvious that they're fake, it can still ruin someone's reputation permanently, but tbh I am sorta jumping to extreme cases I guess.
 
> AI is getting better at looking more and more realistic, that's why. Even if it's obvious that they're fake, it can still ruin someone's reputation permanently, but tbh I am sorta jumping to extreme cases I guess.
I mean, on the flip side, couldn't everyone now just reasonably claim that any nude images of them that show up on the internet are AI and not real?
 
> I mean, on the flip side, couldn't everyone now just reasonably claim that any nude images of them that show up on the internet are AI and not real?
If it looks real enough, then a significant number of people will just not believe them, especially because sending nudes is considered normal for some reason. As for blackmailing kids, they're not the brightest, and I could easily see some ped successfully using it as blackmail anyway for the same reason--it's not about whether it's true, it's about how many people will believe it and what the fallout of that is. The lie flies halfway around the world while the truth is still putting its shoes on and such.

(Also, the point of making your own nudes/porn of someone isn't the lack of availability of porn but of transgressing against that particular individual or making your own content of someone you're obsessed with. It's anti-social behavior IMO. It's content/a behavior they're not giving to you, so you find a way to take it anyway. Though I don't understand how they plan to legislate around that; IMO it could safely fall under defamation of some sort rather than needing new special laws. Like, if it's believable to enough people, then the person who made it and shared it is effectively spreading a lie about you, and if it has negative social consequences for you, that would apply, right?)
 
> If it looks real enough, then a significant number of people will just not believe them, especially because sending nudes is considered normal for some reason. As for blackmailing kids, they're not the brightest, and I could easily see some ped successfully using it as blackmail anyway for the same reason--it's not about whether it's true, it's about how many people will believe it and what the fallout of that is. The lie flies halfway around the world while the truth is still putting its shoes on and such.

Remember back in the day when guys would post almost MSPaint copy-pasted celeb faces on nude models and say it was real? I swear we need something to gatekeep people who are under 110 IQ from using the internet 😐😐
 