KR With a click, 220,000+ member Korean Telegram room generates illegal deepfake porn - A Telegram group was discovered by activists in which men would make AI deepfakes of various women, some of whom are minors, without their consent

Link | Archive
Note: I will add additional information at the bottom of this post since the article doesn't cover everything

Experts say the channel’s profit structure suggests a high level of demand and a lack of understanding that the images it creates are illegal
Screenshots from inside the over 220,000-member Telegram chat room that generated deepfakes of women using AI technology.

Pornographic deepfakes, or digitally altered images generated through AI technology, are being widely distributed on the Telegram messaging app in Korea, including on one Telegram channel that produces deepfake nude images on demand for over 220,000 members. The channel, which was easily accessed through a basic online search, charges money for the fabricated images, which are based on photographs of actual people. Experts say the channel is a stark illustration of the current state of deepfake pornography — which many people are not even aware is illegal.

A Telegram channel that the Hankyoreh accessed through a link on X (formerly known as Twitter) on Wednesday features an image bot that converts photographs of women uploaded to the channel into deepfake nude images. When the Hankyoreh entered the channel, a message popped up asking the user to “upload a photograph of a woman you like.”

The Hankyoreh uploaded an AI-generated photograph of a woman and within five seconds, the channel generated a nude deepfake of that photograph. The deepfake tool even allows users to customize body parts in the resulting image. As of Wednesday, Aug. 21, the Telegram channel had some 227,000 users.

The Telegram channel was very easy to access. A search for specific terms on X and other social media brought up links to the channel, and one post with a link was even featured as a “trending post” on X. Despite recent coverage of sex crimes involving deepfakes and related police investigations, posts are still going up to promote the channel.

The Telegram channel generates up to two deepfake nudes for free but requires payment for further images. Each photograph costs one “diamond” — a form of in-channel currency worth US$0.49, or about 650 won — and users are required to purchase a minimum of 10 diamonds, with discounts available for bulk purchases. Diamonds are also given out for free to users who invite friends to the channel, in an obvious attempt to broaden the user base. Cryptocurrency is the only accepted form of payment, likely for anonymity reasons.

The chat room does not allow users to send messages or pictures to one another. The trouble is that there is no way of knowing how the deepfakes the channel generates are being used by members.

“Sexually exploitative deepfake images created in Telegram rooms like these get shared in other group chat rooms, and that’s how you end up with mass sex crimes,” said Seo Hye-jin, the human rights director at the Korean Women Lawyers Association.

“If there are over 220,000 people in the Telegram room at the stage where these images are being manufactured, the damage from distribution is likely to be massive,” she assessed.

The existence of such a huge Telegram channel with a revenue-generating model is likely a reflection of the reality in which creating highly damaging deepfakes is seen as no big deal.

“The fact that there’s a profit structure suggests that there’s a high level of demand,” said Kim Hye-jung, the director of the Korea Sexual Violence Relief Center.

“Despite the fact that sexually degrading women has become a form of ‘content’ online, the social perception of this as a minor offense is a key factor that plays into sex crimes,” she went on.

By Park Go-eun, staff reporter

Telegram rooms were discovered covering over 70% of South Korean schools, in which female students' faces were photoshopped into porn using AI. Girls in South Korea created a list of the affected schools so students could check whether they were victims. This is just a part of it:
[Screenshots of the school list]
A sample of the chat room from their Telegram:
[Screenshot of the chat room]
A feminist in South Korea has mapped out the schools where deepfake child pornography was created by male students using photos of girls on Telegram. The map is based on the list shown above. The website: https://deepfakemap.xyz/
[Screenshot of the map]
 
Report me then faggot, enough with the back-seat modding.
Nah, I think you've got a chance at seeing that your points are retarded and go completely contrary to the existing law. Suel was simply too much of a schizo and had to be put down
 
I was under the impression that they refused to have children due to shitty living standards, not necessarily because of "feminism". Which, tbh, is based, and an amazing way to pressure the government
I don't think letting yourself go extinct out of spite is the epic own to the government that you seem to think it is
 
> kpop
> most feminised country in the world, literally no testosterone can be found in this shit nation
> highest number of plastic surgery cases per capita in the world
> a country that needed a law stating that any device capable of taking pictures must make a shutter sound of at least 64 decibels
> ...
> said country now has a deepfake porn problem

damn, that's crazy.
 
Who MIGHT be harmed if the images come out. Reputational damage is a characteristic of the social contract, so the most sensible way to handle this is for society not to shame underage girls who get deepfakes made of them, and instead to shame the people who make them. Ironically, there was a similar issue almost exactly 100 years ago, when realistic paintings of women were mass-produced for calendars and postcards and the like. Somehow we got through it without banning people from drawing naked women. Of course, back then half of us weren't raised by daddy government.
You would have defended Shadman during the Dafne Keen cease and desists.

You disgust me.
 
If anything this would be a reason to increase regulation on AI. There is no fucking chance on God's green earth that AI child porn will be legal
That is precisely the goal here. Bait all the retard women and white knights into voting to sabotage the future of Western mathematics even more.
Who MIGHT be harmed if the images come out. […]
It would be better to bring back the advice that used to be given to young Internet users before Lifelog (now known as Facebook) became popular in the mid-2000s: share absolutely nothing about your real life online. No pictures, no real name, no personal details. The entire reason deepfake CP became possible in the first place is that it became normal to broadcast your real-life identity to the entire world.
 
I'd still ignore it and I'd have the plausible deniability of calling it a deepfake. You (and your sockpuppets in this thread) are clearly keen on banning AI in general and keeping the masses weak and dumb.
Nigger, I can count three people (not counting you and the other pedophile) who are mostly cool with AI on this page.

Null, you've got more pedophile apologists
 
90% of girls in their teens look like shit; hell, most people look like shit going through puberty. Wacky chubby face, uneven breasts, maxipad stank, teeth, etc...
Degenerates....
Micropenis....
It depends, some are pretty, but it also isn't like a pedo thinks "woah man, that child is super hot, just look at those brackets and flat-as-a-board chest"; they get off on harming a small child, so how the child looks is largely irrelevant. Always remember that
 
A law means nothing if it can't be enforced. It is likely that making GPU licenses a thing would cause too much collateral damage, though ($NVDA would tank, and rich investors would absolutely lose their shit), so widespread mandatory surveillance technology embedded inside consoomer-grade operating systems is a much more likely outcome.
Biden made a broad law that makes it illegal to use datasets with racial/sexist/whatever bias for machine learning (meaning you can't use FBI crime statistics to train an AI).
The most likely outcome is that companies in bed with the US govt can develop what they want, but everyone else will need to have some sort of auditable, open source data sets that conform to government guidelines.
This article just provides a convenient "think of the children" excuse for it.
 
Biden made a broad law that makes it illegal to use datasets with racial/sexist/whatever bias for machine learning […]

There is also this California bill that will make AI companies liable for any damages that users of their models cause. All of this is deliberate sabotage by the way.
 
Can you explain to me how that is not hilarious? Growing a culture so competitive that you can't even afford the time to bear children is pure comedy; you can't say otherwise.
I think it's more depressing than anything, especially for a nation that has great potential otherwise, but different shades and all that
 
2014: losers spend all day photoshopping women into porn
2024: losers AI-generate women into porn

Same worthless shit, but faster.
No it's not, because there's no barrier to entry with AI. With Photoshop, you had to learn how the software worked and be proficient at using it. With AI, you just type in your order and presto!
 
I can truly see it. There are already laws in place to punish this, so it isn't as if new ones need to be passed to combat it. I think the false dichotomy of "either we pass new laws that force an ID onto everything, or CSAM floods the entire net" is manufactured precisely to sway the kind of internet libertarian who won't let the government get bigger, and thus, inadvertently, make him support the normalization of AI CSAM.
Pedophiles really are the worst thing on this gay Earth. Not only do they gleefully defile the concept of childhood innocence, they're also the perfect tool for a tyrannical government to grab more power.
 