KR With a click, 220,000+ member Korean Telegram room generates illegal deepfake porn - A Telegram group was discovered by activists in which men would make AI deepfakes of various women, some of whom are minors, without their consent

Link | Archive
Note: I will add additional information at the bottom of this post since the article doesn't cover everything

Experts say the channel’s profit structure suggests a high level of demand and a lack of understanding that the images it creates are illegal
[Screenshots from inside the over 220,000-member Telegram chat room that generated deepfakes of women using AI technology.]

Pornographic deepfakes, or digitally altered images generated through AI technology, are being widely distributed on the Telegram messaging app in Korea, including on one Telegram channel that produces deepfake nude images on demand for over 220,000 members. The channel, which was easily accessed through a basic online search, charges money for the fabricated images, which are based on photographs of actual people. Experts say the channel is a stark illustration of the current state of deepfake pornography — which many people are not even aware is illegal.

A Telegram channel that the Hankyoreh accessed through a link on X (formerly known as Twitter) on Wednesday features an image bot that converts photographs of women uploaded to the channel into deepfake nude images. When the Hankyoreh entered the channel, a message popped up asking the user to “upload a photograph of a woman you like.”

The Hankyoreh uploaded an AI-generated photograph of a woman, and within five seconds the channel generated a nude deepfake from it. The deepfake tool even allows users to customize body parts in the resulting image. As of Wednesday, Aug. 21, the Telegram channel had some 227,000 users.

The Telegram channel was very easy to access. A search for specific terms on X and other social media brought up links to the channel, and one post with a link was even featured as a “trending post” on X. Despite recent coverage of sex crimes involving deepfakes and related police investigations, posts are still going up to promote the channel.

The Telegram channel generates up to two deepfake nudes for free but requires payment for further images. Each photograph costs one “diamond” — a form of in-channel currency worth US$0.49, or about 650 won — and users are required to purchase a minimum of 10 diamonds, with discounts available for bulk purchases. Diamonds are also given out for free to users who invite friends to the channel, in an obvious attempt to broaden the user base. Cryptocurrency is the only accepted form of payment, likely for anonymity reasons.

The chat room does not allow users to send messages or pictures to one another. The trouble is that there is no way of knowing how the deepfakes the channel generates are being used by members.

“Sexually exploitative deepfake images created in Telegram rooms like these get shared in other group chat rooms, and that’s how you end up with mass sex crimes,” said Seo Hye-jin, the human rights director at the Korean Women Lawyers Association.

“If there are over 220,000 people in the Telegram room at the stage where these images are being manufactured, the damage from distribution is likely to be massive,” she assessed.

The existence of such a huge Telegram channel with a revenue-generating model is likely a reflection of the reality in which creating highly damaging deepfakes is seen as no big deal.

“The fact that there’s a profit structure suggests that there’s a high level of demand,” said Kim Hye-jung, the director of the Korea Sexual Violence Relief Center.

“Despite the fact that sexually degrading women has become a form of ‘content’ online, the social perception of this as a minor offense is a key factor that plays into sex crimes,” she went on.

By Park Go-eun, staff reporter

Telegram rooms were discovered covering over 70% of South Korean schools, in which female students' faces were photoshopped into porn using AI. Girls in South Korea created a list of schools so they could check whether they were victims. This is just a part of it:
[Screenshots of the list of schools]
A sample of the chat room from their Telegram:
[Screenshot of the Telegram chat room]
A feminist in South Korea has mapped out the schools where deepfake child pornography was created by male students using photos of girls on Telegram. The map is based on the list shown above. The website: https://deepfakemap.xyz/
[Screenshot of the deepfake map]
 
I think it's more depressing than anything, especially for a nation that has great potential otherwise, but different shades and all that.
Sure, it's sad that people have to live through this shit, but my point is that it's amazing how perfectly this situation lines up with the structure of a comedy. Tickled my autistic brain. Anyway, I'm a nerd loser and gay.
 
Women should retaliate by generating deepfakes of real men taking dicks up their asses.
What makes you think women weren't already doing that, and for the same reasons as the men above? It shouldn't surprise anyone at all that something like this exists for women too, but with men and boys.
 
...and there's absolutely nothing wrong with that. "Deepfake" is just bullshit fearmongering. If I had fake porn made of me, I'd just ignore it.
You aren't a woman and have a fundamentally different relationship to sex and porn, you fucking retard. You also aren't a middle school girl.

You're so fucking off the mark it concerns me.

Sorry for double post
 
The Twitter account that posted this has been suspended, presumably due to mass reporting from Korean moids, but not before they managed to post about other Telegram rooms: “Telegram group chats named ‘humiliation-room’. They include ‘sister-room’ and ‘mother-room’, where men take and share pictures of groping their female family members’ bodies while they’re asleep.”

“‘Humiliation-room’ is so common that every woman in every aspect of this country (family, elementary school, middle school, high school, university, province, city, workplace, literally everywhere) is being victimized.”

“In this ‘room’, they mainly share normal pictures of their female friends and coworkers to make deepfake porn videos. So if you have selfies on SNS or have taken pictures with one of them, you can also be a victim.”

These are translations of the original Korean thread; only this screenshot remains because Twitter nuked the account.

[Screenshot of the original Korean thread]
 
South Korea has banned pornography, yet this does not deter degenerates from getting and creating it. It's a horrible situation for girls to be in, especially with East Asians' infatuation with youth/cuteness.

But what in the hell is wrong with them? Unlike Japan, they didn't even get nuked to justify the nuclear levels of coom degeneracy
They're influenced by Japan (and China). The nuclear degeneracy has spread and infected the rest of the world.
 
They're influenced by Japan (and China). The nuclear degeneracy has spread and infected the rest of the world.

The US and its push for depravity has also influenced Worst Korea to reach these levels of toxicity, to the point that it makes Best Korea executing people for having USB drives of K-Pop, anime, Squid Game, and porn seem justifiable.

These horrible things coming out of Worst Korea do make a South Korea General thread seem more and more like a good idea, even though the stories out of there are anything but.
 
My current schizo conspiracy is that AI-generated CSAM is going to be used as a stalking horse for trying to legalise the real thing.
No
Things were going pretty well there for a while; there was a growing awareness of, and repugnance toward, lolicon etc., and pushes to ban it pretty much all across the West, but that was always fringe anyway. AI CSAM is going to explode, and combine that with the degenerates' attempts to lower consent laws etc...
It's way worse than that already. Generated images are reported to the same government and non-government entities, like NCMEC, that real, extremely harmful images of actual abuse are. I've heard figures like 95% of all reports today are either AI or drawings; much of it comes from tech companies like Facebook or Google, who use automated systems to detect this stuff.

Maybe you think there should be some rule not to report AI images, or to classify them differently, but it's very hard to even tell which is which; automated classification can't do it.

All the systems we have in place to find and protect children in danger are falling apart. Does it even matter what's legal or not if it's almost impossible to stop?

If anything, this would be a reason to increase regulation on AI. There is no fucking chance on God's green earth that AI child porn will be legal.
All pornography is already illegal in Korea.

Is that working?

Regulating "big" AI services sounds like a grand idea until you understand that generating images can be done client side in a very short amount of time using open models created by small communities. This would be like trying to regulate mp3s in 2000, cat's already out of the bag.

I'm sorry, but the cat is out of the bag with this tech, and at some point people will have to learn to ignore it. And to stop putting pictures of themselves online. lern2opsec
This will be the only way forward. We don't share photos of our kids online and have strict rules on how they are using technology.
 
The women are active man-haters who think letting Korea go extinct by not having kids is a legitimate form of feminism
I've never considered myself a feminist, but if my choices are dying childless or marrying a man who'd make AI porn of his colleagues, family members, and children, I'm choosing the former without a moment's hesitation.
 
This is why I fully believe that every time shitposters get one of these AI and deepfake things shut down by feeding it wrongthink, it's a win for us.

Fuck these things
 
All pornography is already illegal in Korea.

Is that working?

Regulating "big" AI services sounds like a grand idea until you understand that generating images can be done client side in a very short amount of time using open models created by small communities. This would be like trying to regulate mp3s in 2000, cat's already out of the bag.
Only the creation and distribution of porn is illegal in South Korea. Consumption is not. I'm not sure what you are implying when you question if the "illegality" of said porn is working, when the topic at hand is CP. Could you explain what you meant?
 
I've never considered myself a feminist, but if my choices are dying childless or marrying a man who'd make AI porn of his colleagues, family members, and children, I'm choosing the former without a moment's hesitation.
If you ever have/had a family, fucking off innawoods seems like a better option every day now with all the insanity. Unfortunately... We live in a society...
If it has artistic value, then it fails the Miller test.
Just because it fails the Miller test doesn't mean you won't get convicted anyway outside of LA.
 
Only the creation and distribution of porn is illegal in South Korea. Consumption is not. I'm not sure what you are implying when you question if the "illegality" of said porn is working, when the topic at hand is CP. Could you explain what you meant?
He means that it is infeasible to enforce the ban on porn.
 