Revealed: the names linked to ClothOff, the deepfake pornography app - Exclusive: Guardian investigation for podcast series Black Box reveals names connected to app that generated nonconsensual images of underage girls around the world


Michael Safi and Alex Atack in Almendralejo, Spain, and Joshua Kelly in London
Thu 29 Feb 2024

Prologue: the collision – podcast

An image of ‘Ewan Liam Torres’ likely to have been generated by AI.

The first Miriam al-Adib learned of the pictures was when she returned home from a business trip. “Mum,” said her daughter. “I want to show you something.”

The girl, 14, opened her phone to show an explicit image of herself. “It’s a shock when you see it,” said Adib, a gynaecologist in the southern Spanish town of Almendralejo and a mother of four daughters. “The image is completely realistic … If I didn’t know my daughter’s body, I would have thought that image was real.”

It was a deepfake, one of dozens of nude images of schoolgirls in Almendralejo that had been generated by artificial intelligence (AI) and which had been circulating in the town for weeks in a WhatsApp group set up by other schoolchildren.

Some of the girls whose likenesses were being spread were refusing to go to school, suffering panic attacks, being blackmailed and getting bullied in public. “My concern was that these images had reached pornographic sites that we still don’t know about today,” Adib told the Guardian from her clinic in the town.

State prosecutors are considering charges against some of the children, who created the images using an app downloaded from the internet. But they said they have been unable to identify the people who developed the app, whom they suspect are based somewhere in eastern Europe.

The Spanish incident flared into global news last year and made Almendralejo, a small town of faded Renaissance-era churches and plazas near the Portuguese border, the site of the latest in a series of warning shots from an imminent future in which AI tools allow anyone to generate hyper-realistic images with a few clicks.

But while deepfakes of pop stars such as Taylor Swift have generated the most attention, they represent the tip of an iceberg of nonconsensual images that are proliferating across the internet and which police are largely powerless to stop.

As Adib was learning of the pictures, thousands of miles away at the Westfield high school in New Jersey, a strikingly similar case was playing out: many girls were targeted by explicit deepfake images generated by students in their classes. The New Jersey incident has prompted a civil lawsuit and helped fuel a bipartisan effort in the US Congress to ban the creation and spread of nonconsensual deepfake images.

At the centre of both incidents, in Spain and New Jersey, was the same app: ClothOff.

In the year since the app was launched, the people running ClothOff have carefully guarded their anonymity, digitally distorting their voices to answer media questions and, in one case, using AI to generate an entirely fake person who they claimed was their CEO.

A picture of ‘Ewan Liam Torres’, who ClothOff claims is its CEO, but which is likely to be an AI-generated image.

But a six-month investigation, conducted for a new Guardian podcast series called Black Box, can reveal the names of several people who have done work for ClothOff or who our investigation suggests are linked to the app.

Their trail leads to Belarus and Russia but passes through businesses registered in Europe and front companies based in the heart of London.

ClothOff, whose website receives more than 4m monthly visits, invites users to “undress anyone using AI”. The app can be accessed through a smartphone by clicking a button that confirms the user is over 18, and charges approximately £8.50 for 25 credits.

The credits are used to upload photographs of any woman or girl and return the same image stripped of clothing.

A brother and sister in Belarus

Screenshots seen by the Guardian indicate that a Telegram account in the name of Dasha Babicheva, who social media accounts suggest is in her mid-20s and lives in the Belarus capital, Minsk, has conducted business on ClothOff’s behalf, including discussing applications to banks, changes to the website and business partnerships.

A profile picture from a Telegram account in the name of Dasha Babicheva.

In one screenshot, the account in Babicheva’s name tells a counterpart at another firm that if journalists have questions about ClothOff, “they can contact us on this email”, providing the website’s press contact.

An Instagram account in Babicheva’s name, which shared some of the same images with the Telegram account in her name and which listed the same phone number, was made private after the Guardian started making inquiries, and the phone number was deleted from the profile.

Babicheva did not respond to detailed questions.

A profile picture taken from the LinkedIn account in the name of Alaiksandr Babichau.

Alaiksandr Babichau, 30, identified in social media accounts as Dasha Babicheva’s brother, also appears to be closely linked to ClothOff.

In a recruitment advertisement, ClothOff directed applicants to an email address from the website AI-Imagecraft.

Domain-name records for AI-Imagecraft show the website owner’s name has been hidden at the owner’s request.

But AI-Imagecraft has a virtually identical duplicate website, A-Imagecraft, whose owner has not been hidden: it is listed as Babichau. The Guardian was able to log in to both A-Imagecraft and AI-Imagecraft using the same username and password, indicating the two websites are linked.

There are further links between Babichau and ClothOff. The Guardian has seen screenshots of conversations between ClothOff staff and a potential business partner. The ClothOff staff are identified only by their first names and one of them, identified by another staff member as the “founder”, had the Telegram display name “Al”.

The Guardian compared videos posted to Al’s Telegram account with publicly available footage posted to an account in the name of Alaiksandr Babichau. Both Al and Babichau had uploaded videos and photos showing the same hotel in Macau on 24 January, and rooms in the same Hong Kong hotel on 26 January. The correlation suggests the two accounts belong either to two people who travelled to the same cities at the same time, or to the same person.

Reached over the phone last week, Babichau denied any connection to the deepfake app, claimed he did not have a sister named Dasha, and said a Telegram account in his name, which listed his phone number, did not belong to him. In response to further inquiries, he abruptly ended the call and has not responded to detailed questions sent by email.

Shortly after the conversation, the Guardian was blocked by the Telegram account he claimed did not belong to him.


A money trail through London

Payments to ClothOff revealed the lengths to which the app’s creators have gone to disguise their identities. Transactions led to a company registered in London called Texture Oasis, a firm that claims to sell products for use in architectural and industrial-design projects.

But the company appears to be a fake business designed to disguise payments to ClothOff.

The text on the firm’s website was copied from the website of another, legitimate, business, as was a list of staff members. When the Guardian contacted one of the people listed as a Texture Oasis employee, he said he had never heard of the business. Our investigation has found no other links between the named staff and ClothOff, adding to the suggestion that the staff list had been copied.

The Guardian has also unearthed links between ClothOff and an online video-game marketplace called GGSel, described by its CEO as a way for Russian gamers to circumvent western sanctions.

Both websites briefly listed the same business address last year: a company based in London called GG Technology Ltd, registered to a Ukrainian national named Yevhen Bondarenko. Both websites have since deleted any reference to the firm.

The LinkedIn account in Babichau’s name lists him as a GGSel employee.

Meanwhile, an account in the name of Alexander German, described as a web developer whose LinkedIn says he also works at GGSel, uploaded website code for ClothOff to an account in his name on GitHub, a coding repository. This source code was deleted a short time later.

Reached on the phone number listed on his LinkedIn, a man who identified himself as Alexander German denied he was a web developer or linked in any way to ClothOff.

Several LinkedIn accounts that listed their employment at GGSel on their profiles deleted any reference to the company or removed their surnames and pictures after the Guardian started making inquiries about links between GGSel and ClothOff.

In a statement, GGSel denied any involvement with ClothOff and said it had no connection to GG Technology Ltd, but could not or did not explain why the company was listed on its website as its owner last year. It said neither Babichau nor German had ever been employees and that it would contact LinkedIn to ask for the references to be removed from the profiles in their names.

Bondarenko deleted his social media accounts on Wednesday and the Guardian was unable to reach him for comment.

ClothOff said in response to questions that it had no connection with GGSel nor any of those named in this article. A spokesperson claimed it was impossible to use its app to “process” the images of people under the age of 18 but did not specify how or why – nor how images, including of children, were generated by the app in Spain. They speculated the images in New Jersey may have been created using a competitor service.

On Thursday, access to the ClothOff website and app appeared to have been blocked in the UK, but they were still available elsewhere.

The investigation has shown the growing difficulty of distinguishing real people from fake identities that can be accompanied by high-quality photographs, videos and even audio. A fuller account of this story will be published in an episode of Black Box to be released next Thursday.
  • Additional reporting by Matteo Fagotto, Phil McMahon, Oliver Laughland, Manisha Ganguly, Andrew Roth, Yanina Sorokina and Kateryna Malofieieva.
  • Do you know more about this story? Contact michael.safi@theguardian.com
 
Edit: is this article an ad?
Definitely not on purpose. They want these apps blocked, deplatformed, or shut down by raids.
State prosecutors are considering charges against some of the children, who created the images using an app downloaded from the internet. But they had been unable to identify the people who developed the app, who prosecutors suspect are based somewhere in eastern Europe, they said.
But while deepfakes of pop stars such as Taylor Swift have generated the most attention, they represent the tip of an iceberg of nonconsensual images that are proliferating across the internet and which police are largely powerless to stop.
On Thursday, access to the ClothOff website and app appeared to have been blocked in the UK, but they were still available elsewhere.
 
Edit: is this article an ad?
I suspect the same. Things like that are hard to advertise conventionally without tarnishing the ad hoster's reputation, so "Look at this wicked thing, which you 100% shouldn't use to generate what you want" is a viable tactic.
Now, this is different from the app that puts clothes ON thots; that's totally different and somehow just as bad.
The problem I have with Dignif.ai threads on 4chan is they give exposure to the e-thots by requiring the @handle and nude pictures before they can be changed.
Also, I don't like how some changed the race of the half-Black children of single mothers, one can be opposed to e-thotting and support racial harmony.
deepfakes of pop stars such as Taylor Swift
If people actually saw those images, they wouldn't see it as an attack on TS or women.

As an aside, a similar tactic was used to smear Tucker Carlson as a racist bigot over his human sympathy for a curbstomped antifa kid.
EXHIBIT 276: Tucker Carlson's Text to a Producer on January 7, 2021
A couple of weeks ago, I was watching video of people fighting on the street in Washington. A group of Trump guys surrounded an Antifa kid and started pounding the living shit out of him. It was three against one, at least. Jumping a guy like that is dishonourable obviously. It's not how white men fight. Yet suddenly I found myself rooting for the mob against the man, hoping they'd hit him harder, kill him. I really wanted them to hurt the kid. I could taste it. Then somewhere deep in my brain, an alarm went off: this isn't good for me. I'm becoming something I don't want to be. The Antifa creep is a human being. Much as I despise what he says and does, much as I'm sure I'd hate him personally if I knew him, I shouldn't gloat over his suffering. I should be bothered by it. I should remember that somewhere somebody probably loves this kid, and would be crushed if he was killed. If I don't care about those things, if I reduce people to their politics, how am I better than he is?
 
It does, and it rejected one I tried of an adult with a youngish face. I doubt it would reject a busty high school girl.
I only skimmed the article, but how is it being used on 14-year-olds if it even rejects youthful-looking adults? Seems like it needs its systems refined.
 
Ragebait like this will continue to be pumped out to be consoomed by NPC foids until they pull a prohibition 2: electric jigaboo and force a new amendment limiting AI models to only be legally owned and operated by corporations.

It's too easy for corporate causes like this to psyop women into doing their bidding and be outraged at whatever they want to meet their ends. I hate this gay planet.
 
If I didn’t know my daughter’s body, I would have thought that image was real.”
...why and how do you know your 14-year-old daughter's naked body enough to tell it apart from an AI-generated 14-year-old naked body?
 
They really need to develop an app for "Here's what you would look like if you were Black or Chinese"
Or better yet, Black and Chinese, it's annoying being the only Blasian online.
why and how do you know your 14-year-old daughter's naked body enough to tell it apart from an AI-generated 14-year-old naked body
Because they started washing and bathing them 14 years ago
 
Ragebait like this will continue to be pumped out to be consoomed by NPC foids until they pull a prohibition 2: electric jigaboo and force a new amendment limiting AI models to only be legally owned and operated by corporations.
We already had the AI art debacle cause a load of independent artists to agitate for "producing art in another person's style" to become a breach of IP law, which would obviously just massively expand the already huge amount of power granted to the giant IP holders like Disney.
I don't have high hopes, basically.
 