KR With a click, 220,000+ member Korean Telegram room generates illegal deepfake porn - Activists discovered a Telegram group in which men make AI deepfakes of various women, some of whom are minors, without their consent

Link | Archive
Note: I will add additional information at the bottom of this post since the article doesn't cover everything

Experts say the channel’s profit structure suggests a high level of demand and a lack of understanding that the images it creates are illegal
[Image: Screenshots from inside the over 220,000-member Telegram chat room that generated deepfakes of women using AI technology.]

Pornographic deepfakes, or digitally altered images generated through AI technology, are being widely distributed on the Telegram messaging app in Korea, including on one Telegram channel that produces deepfake nude images on demand for over 220,000 members. The channel, which was easily accessed through a basic online search, charges money for the fabricated images, which are based on photographs of actual people. Experts say the channel is a stark illustration of the current state of deepfake pornography — which many people are not even aware is illegal.

A Telegram channel that the Hankyoreh accessed through a link on X (formerly known as Twitter) on Wednesday features an image bot that converts photographs of women uploaded to the channel into deepfake nude images. When the Hankyoreh entered the channel, a message popped up asking the user to “upload a photograph of a woman you like.”

The Hankyoreh uploaded an AI-generated photograph of a woman and within five seconds, the channel generated a nude deepfake of that photograph. The deepfake tool even allows users to customize body parts in the resulting image. As of Wednesday, Aug. 21, the Telegram channel had some 227,000 users.

The Telegram channel was very easy to access. A search for specific terms on X and other social media brought up links to the channel, and one post with a link was even featured as a “trending post” on X. Despite recent coverage of sex crimes involving deepfakes and related police investigations, posts are still going up to promote the channel.

The Telegram channel generates up to two deepfake nudes for free but requires payment for further images. Each photograph costs one “diamond” — a form of in-channel currency worth US$0.49, or about 650 won — and users are required to purchase a minimum of 10 diamonds, with discounts available for bulk purchases. Diamonds are also given out for free to users who invite friends to the channel, in an obvious attempt to broaden the user base. Cryptocurrency is the only accepted form of payment, likely for anonymity reasons.

The chat room does not allow users to send messages or pictures to one another. The trouble is that there is no way of knowing how the deepfakes the channel generates are being used by members.

“Sexually exploitative deepfake images created in Telegram rooms like these get shared in other group chat rooms, and that’s how you end up with mass sex crimes,” said Seo Hye-jin, the human rights director at the Korean Women Lawyers Association.

“If there are over 220,000 people in the Telegram room at the stage where these images are being manufactured, the damage from distribution is likely to be massive,” she assessed.

The existence of such a huge Telegram channel with a revenue-generating model is likely a reflection of the reality in which creating highly damaging deepfakes is seen as no big deal.

“The fact that there’s a profit structure suggests that there’s a high level of demand,” said Kim Hye-jung, the director of the Korea Sexual Violence Relief Center.

“Despite the fact that sexually degrading women has become a form of ‘content’ online, the social perception of this as a minor offense is a key factor that plays into sex crimes,” she went on.

By Park Go-eun, staff reporter

Telegram rooms in which female students' faces were photoshopped into porn using AI were discovered covering over 70% of South Korean schools. Girls in South Korea created a list of those schools so potential victims could check whether theirs was affected. This is just a part of it:
[Images: excerpts from the list of schools]
A sample of the chat room from their Telegram:
[Image: Telegram chat room screenshot]
A feminist in South Korea has mapped out schools where deepfake child pornography was created by male students using photos of girls on Telegram. The map is based on the list above. The website: https://deepfakemap.xyz/
[Image: screenshot of the map at deepfakemap.xyz]
 
>I'd still ignore it and I'd have the plausible deniability of calling it a deepfake. You (and your sockpuppets in this thread) are clearly keen on banning AI in general and keeping the masses weak and dumb.
>sockpuppets
bitch where

1. Anyways, you can't ignore literal deepfake child porn. That's a federal crime.
2. Not many people are going to buy the whole "IT'S A DEEPFAKE!!!" shit when there's no proof that it's a deepfake (even when it is). This is nonconsensual sexual harassment. You don't just ignore that shit. Watch the movie and tell those minors to just ignore it.
 
>...and there's absolutely nothing wrong with that. "Deepfake" is just bullshit fearmongering. If I had fake porn made of me, I'd just ignore it.
There have already been many instances of people using fake images/audio to try to get people fired or arrested, you fuckwit. This shit isn't just posted on Discord and Telegram for people to laugh at.

Also people making deepfake CP absolutely deserve to be fed into a wood chipper. Fuck those guys.
 
>Who MIGHT be harmed if the images come out. Reputational damage is a characteristic of the social contract, making the most sensible way to handle this to be for society to not shame underage girls who get deepfakes made of them, indeed to shame the people who make them instead. Ironically there was a similar issue almost exactly 100 years ago when realistic paintings of women were mass produced for calendars and postcards and the like. Somehow we got through it without banning people from drawing naked women. Of course back then half of us weren't raised by daddy government.
I don't know, the children that were systematically used and abused and fed into a soulless machine to create child porn of them, which is what this is. That's who was harmed.
 
This is the same guy that also said this:
[Attachment 6353988: screenshot of the post in question]

Stands to reason that @PLEASEBANMYASS doesn't believe that deepfake child porn is real child porn and doesn't think it needs prison time as a punishment.
It's not even a debate: deepfake child porn is exactly the same as "real" child porn in the eyes of the law. That's just a fact. Both deserve prison time and both deserve a wood chipper. It's not a free speech issue, it's not even an AI issue; it's illegal, flat out.
 
>>sockpuppets
>bitch where
>
>1. Anyways, you can't ignore literal deepfake child porn. That's a federal crime.
>2. Not many people are going to buy the whole "IT'S A DEEPFAKE!!!" shit when there's no proof that it's a deepfake (even when it is). This is nonconsensual sexual harassment. You don't just ignore that shit. Watch the movie and tell those minors to just ignore it.
1. There's no proof that this group generated AI CP; you think the media can't lie?
2. AI is very easy to tell apart from the real thing. And also, no, it's not. There's no physical touching, so it's not 'nonconsensual'. I'm not watching your movie.

>There have already been many instances of people using fake images/audio to try to get people fired or arrested, you fuckwit. This shit isn't just posted on Discord and Telegram for people to laugh at.
>
>Also people making deepfake CP absolutely deserve to be fed into a wood chipper. Fuck those guys.
They should just get a new job.
 
>I'd still ignore it and I'd have the plausible deniability of calling it a deepfake. You (and your sockpuppets in this thread) are clearly keen on banning AI in general and keeping the masses weak and dumb.

I don't want to ban AI, but if it's clear a model has been trained on CSAM, or on minors in general, then it's an issue that needs to be addressed. If someone is feeding that shit to the AI to train it, that person should get in trouble; it's that simple.

This thread is about how some people are selling a make-your-own-deepfake service, which is being used by students to make porn of other students, some of whom are minors.

>2. AI is very easy to tell apart from the real thing
That's not the point; the end result isn't what matters, it's how it got there, what it saw to generate that image.
 
>Pretty sure the article spells it out that it happened, dipshit.
>
>No it's not, not always.
>
>THIS NIGGER THINKS IT'S OKAY TO GENERATE AI CHILD PORN
1. It literally didn't. (((MSM))) lies all the time; why should I believe them?
2. Completely unrelated. Why are you nitpicking something I posted a billion years ago?
>I don't want to ban AI, but if it's clear a model has been trained on CSAM, or on minors in general, then it's an issue that needs to be addressed. If someone is feeding that shit to the AI to train it, that person should get in trouble; it's that simple.
>
>This thread is about how some people are selling a make-your-own-deepfake service, which is being used by students to make porn of other students, some of whom are minors.
>
>That's not the point; the end result isn't what matters, it's how it got there, what it saw to generate that image.
No proof this model was trained with CP.

And the end result is easy to tell from reality.
 
I am probably late, but doesn't this kind of stuff require images to go off of? Wouldn't they be unable to make CSAM without the images?
 
Korean men are really some of the shittiest scum to walk the earth. Of course it all raises the question of how many of these things are floating around everywhere and why a certain demographic can't keep their hands out of their pants. Cool, more black pills, that's nice. If you want some more Korean black pills, start with the Burning Sun and go from there.
 
Yeah, until the Miller test says it actually is not, and everyone that did this or something similar (in the US) goes to prison.
If it has serious artistic value, then it fails the Miller test (i.e., it isn't legally obscene).
If you take a teenager's head and put it on a nude female body, that's still CP. The intent is to use a minor to get off.
Teenagers aren't really easy to tell apart from 18+. I've seen many actual adults that resemble children.
I feel like the people defending this shit are the same ones that defend those baby sex dolls.

"It's not a real child, it's not hurting anyone! It's easy to tell the difference between this rubber one and the ones I watch from the bushes across the street from the school!"
Non sequitur.
 