KR With a click, 220,000+ member Korean Telegram room generates illegal deepfake porn - Activists discovered a Telegram group in which men make AI deepfakes of various women, some of whom are minors, without their consent

Link | Archive
Note: I will add additional information at the bottom of this post since the article doesn't cover everything

Experts say the channel’s profit structure suggests a high level of demand and a lack of understanding that the images it creates are illegal
Screenshots from inside the over 220,000-member Telegram chat room that generated deepfakes of women using AI technology.

Pornographic deepfakes, or digitally altered images generated through AI technology, are being widely distributed on the Telegram messaging app in Korea, including on one Telegram channel that produces deepfake nude images on demand for over 220,000 members. The channel, which was easily accessed through a basic online search, charges money for the fabricated images, which are based on photographs of actual people. Experts say the channel is a stark illustration of the current state of deepfake pornography — which many people are not even aware is illegal.

A Telegram channel that the Hankyoreh accessed through a link on X (formerly known as Twitter) on Wednesday features an image bot that converts photographs of women uploaded to the channel into deepfake nude images. When the Hankyoreh entered the channel, a message popped up asking the user to “upload a photograph of a woman you like.”

The Hankyoreh uploaded an AI-generated photograph of a woman and within five seconds, the channel generated a nude deepfake of that photograph. The deepfake tool even allows users to customize body parts in the resulting image. As of Wednesday, Aug. 21, the Telegram channel had some 227,000 users.

The Telegram channel was very easy to access. A search for specific terms on X and other social media brought up links to the channel, and one post with a link was even featured as a “trending post” on X. Despite recent coverage of sex crimes involving deepfakes and related police investigations, posts are still going up to promote the channel.

The Telegram channel generates up to two deepfake nudes for free but requires payment for further images. Each photograph costs one “diamond” — a form of in-channel currency worth US$0.49, or about 650 won — and users are required to purchase a minimum of 10 diamonds, with discounts available for bulk purchases. Diamonds are also given out for free to users who invite friends to the channel, in an obvious attempt to broaden the user base. Cryptocurrency is the only accepted form of payment, likely for anonymity reasons.

The chat room does not allow users to send messages or pictures to one another. The trouble is that there is no way of knowing how the deepfakes the channel generates are being used by members.

“Sexually exploitative deepfake images created in Telegram rooms like these get shared in other group chat rooms, and that’s how you end up with mass sex crimes,” said Seo Hye-jin, the human rights director at the Korean Women Lawyers Association.

“If there are over 220,000 people in the Telegram room at the stage where these images are being manufactured, the damage from distribution is likely to be massive,” she assessed.

The existence of such a huge Telegram channel with a revenue-generating model is likely a reflection of the reality in which creating highly damaging deepfakes is seen as no big deal.

“The fact that there’s a profit structure suggests that there’s a high level of demand,” said Kim Hye-jung, the director of the Korea Sexual Violence Relief Center.

“Despite the fact that sexually degrading women has become a form of ‘content’ online, the social perception of this as a minor offense is a key factor that plays into sex crimes,” she went on.

By Park Go-eun, staff reporter

Telegram rooms were discovered targeting over 70% of South Korean schools, in which female students' faces were photoshopped into porn using AI. Girls in South Korea created a list of affected schools so they could check whether they were victims. This is just a part of it:
A sample of the chat room from their Telegram:
A feminist in South Korea has mapped out schools where deepfake child pornography was created by male students using photos of girls on Telegram. The Map was based on the list you saw earlier. The website: https://deepfakemap.xyz/
 
Wtf is it with all these faggot pedophiles suddenly coming into threads thinking they're going to convince everyone that it's okay to look at kiddy porn cause a computer made it?
I never said CP (or having sex with children) is good; all I said is that the media has lied and always will. There's no actual proof these people generated AI CP.
 
1. There's no proof that this group generated AI CP, you think the media can't lie?
A feminist in South Korea has mapped out schools where deepfake child pornography was created by male students using photos of girls on Telegram. The Map was based on the list you saw earlier. The website: https://deepfakemap.xyz/

If you think all those >500 schools are all filled with nothing but adult students, you are fucking retarded

middle school, you faggot
I never said CP (or having sex with children) is good; all I said is that the media has lied and always will. There's no actual proof these people generated AI CP.
Based on your past comments, you don't think AI-generated CP is an issue.
 
God damn it. A quarter of the internet wants to fuck children, another quarter wants to troon them out, another quarter wants to use them as a political tool, and the last quarter that supposedly exists entirely to "defend children" are clout chasers and hypocrites like MamaMax. Really, giving a phone to a kid in this day and age should count as child abuse. I feel for the alphas and betas; what a sad existence awaits them.
 
I am probably late, but doesn't this kind of stuff require images to go off of? Why would they be able to make CSAM then?
Plenty of AI image generators can make pictures of things that have never been seen before in real life, like Spongebob flying a plane into the twin towers. It can do this because pictures of Spongebob, plane cockpits, and the twin towers are all in its dataset.
So if an AI model had both pictures of children and porn in its training data, then it will be able to generate fake CP. This is why the current governance apparatus is hell-bent on putting legal guardrails on open-source AI models, by the way.
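To make that composition point concrete, here is a minimal sketch of the same idea, assuming the Hugging Face diffusers library and a public Stable Diffusion checkpoint (neither is named in this thread); it is just an illustration of text-to-image concept mixing, not anyone's actual setup:

# Minimal sketch (assumed setup, not from the thread): Hugging Face `diffusers`
# with a public Stable Diffusion checkpoint, showing that a text-to-image model
# can combine concepts it has only ever seen separately in its training data.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # assumed public checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Each concept (the cartoon character, the cockpit, the skyline) appears in the
# training data on its own; the combination in one image is novel.
prompt = "SpongeBob sitting in an airplane cockpit flying over a city skyline"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("composite_concepts.png")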
 
I am probably late, but doesn't this kind of stuff require images to go off of? Wouldn't they be unable to make CSAM without the images?
Yes. Most of the time this requires either a base image of children to undress (thus creating child pornography) or being fed a database of already-naked children (thus being fed literal child pornography of physically abused kids).
 
If you think all those >500 schools are all filled with adults, you are fucking retarded

Based on your past comments, you don't think AI-generated CP is an issue.
A feminist in South Korea has mapped out schools where deepfake child pornography was created by male students using photos of girls on Telegram. The Map was based on the list you saw earlier. The website: https://deepfakemap.xyz/
>feminist
Come on now.

This is a completely different issue, and you derailed the conversation to talk about AI CP.
 
Plenty of AI image generators can make pictures of things that have never been seen before in real life, like Spongebob flying a plane into the twin towers. It can do this because pictures of Spongebob, plane cockpits, and the twin towers are all in its dataset.
So if an AI model had both pictures of children and porn in its training data, then it will be able to generate fake CP. This is why the current governance apparatus is hell-bent on putting legal guardrails on open-source AI models, by the way.
The AI needs a photo of spongebob and a photo of the twin towers in order to make that new image

The AI needs a child's nude body in order to depict a nude child body and not an adult's nude body with a child's head slapped on.
>feminist
Come on now.
This is a completely different issue, and you derailed the conversation to talk about AI CP.

Cope and seethe, pedo nigger.

THESE SCHOOLS WERE ALL TARGETS OF DEEPFAKE PORN. INCLUDING OF CHILDREN IN MIDDLE SCHOOL.

AND ELEMENTARY SCHOOL
 
The AI needs a photo of spongebob and a photo of the twin towers in order to make that new image

The AI needs a child's nude body in order to depict a nude child body and NOT an adult's nude body.


COPE AND SEETHE, PEDO NIGGER

THESE SCHOOLS WERE ALL TARGETS OF DEEPFAKE PORN. INCLUDING OF CHILDREN.
This conversation is going nowhere. Have a nice day.
 
I'm sorry, but the cat is out of the bag with this tech, and at some point people will have to learn to ignore it. And to stop putting pictures of themselves online. lern2opsec
So if an AI model had both pictures of children and porn in its training data, then it will be able to generate fake CP. This is why the current governance apparatus is hell-bent on putting legal guardrails on open-source AI models, by the way.
Well, that's the excuse they use, anyway.
 
I'm sorry, but the cat is out of the bag with this tech, and at some point people will have to learn to ignore it. And to stop putting pictures of themselves online. lern2opsec
You are witnessing an attempt to get women (many of whom constantly live on emotional autopilot) to manufacture consent for any sort of shit-tier policy that will make it harder to use AI at all. I have explained the reason for this multiple times in other posts.
 
I really feel like "Telegram" is the wrong thing to focus on here, but it's the one that journalists and governments will be most interested in.
They detained Dr. Durov. The smear campaign against Telegram is in full swing, and what could be better for that goal than a literal 220k-strong child porn distribution ring?
 
I am probably late, but doesn't this kind of stuff require images to go off of? Wouldn't they be unable to make CSAM without the images?

Not quite. You could train it on regular, if young-looking, porn, then mix in the bodies of children, and given enough time and tweaking the AI could make realistic fake CSAM.

I have said it before, but I think this stuff shouldn't get a pass because of that. Even though it technically isn't real, much like the drawn or 3D-generated stuff that is legal, it is close enough to fool most people, unlike a drawing or animation, which leaves reasonable doubt about its nature. Treating it just like the real stuff is the best way to deal with it, as it keeps pedos from generating or sharing it, and if they insist on doing it they won't be able to hide behind "akshually it's AI, you're gonna have to prove my 12TB collection is real".

I feel like the people defending this shit are the same ones that defend those baby sex dolls.

It's not a real child, it's not hurting anyone! It's easy to tell the difference between this rubber one and the ones I watch from the bushes across the street from the school!

That sort of thinking doesn't fly IMO, even if you steelman it, because of the point I made about it looking realistic enough to fool most people.
 
Treating it just like the real stuff is the best way to deal with it, as it keeps pedos from generating or sharing it, and if they insist on doing it they won't be able to hide behind "akshually it's AI, you're gonna have to prove my 12TB collection is real".
This is likely going to be what happens. Which will eventually lead to "OI BRUV U GOT A LOICENSE FOR THAT GPU?"
 
This is likely going to be what happens. Which will eventually lead to "OI BRUV U GOT A LOICENSE FOR THAT GPU?"

Well no, you only need to treat the pictures, not the generator itself.

But sadly you are likely right because the powers that be REALLY want to control everything.
 
Well no, you only need to treat the pictures, not the generator itself.

But sadly you are likely right because the powers that be REALLY want to control everything.
A law means nothing if it can't be enforced. It is likely that making GPU licenses a thing will cause too much collateral damage, though ($NVDA will tank, rich investors will absolutely lose their shit), so widespread mandatory surveillance technology embedded inside consoomer-grade operating systems is a much more likely outcome.
 