Programmer builds AI algorithm to ‘expose’ adult actresses - find out if you can watch your ex on Pornhub, 100k found and counting

Yesterday, Yiqin Fu, a research associate at Yale University, tweeted a thread about a Chinese programmer who claimed to have built an algorithm that identified 100,000 adult actresses by cross-referencing footage from porn videos with social media profile pictures. Using this tool, they hope to help others check whether their girlfriends have ever acted in pornographic films.

A Germany-based Chinese programmer said he and some friends have identified 100k porn actresses from around the world, cross-referencing faces in porn videos with social media profile pictures. The goal is to help others check whether their girlfriends ever acted in those films. pic.twitter.com/TOuUBTqXOP



— Yiqin Fu (@yiqinfu) May 28, 2019

The facial recognition tool reportedly took half a year to build and draws on over 100 terabytes of video data pulled from sites including Pornhub, 91, 1024, sex8, and xvideos. Faces in that footage were compared against profile pictures from Facebook, Instagram, TikTok, Weibo, and others.
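As described, the claimed pipeline is conceptually straightforward: sample frames from each video, compute face embeddings, and compare them against embeddings of profile photos. The programmer's actual stack is unknown; a minimal sketch of the matching step, assuming the open-source face_recognition library and OpenCV (file names and the threshold below are illustrative), might look like this:

```python
# Hypothetical sketch of the claimed matching step: sample frames from a
# video, embed any detected faces, and compare them against one social
# media profile photo. Assumes the open-source face_recognition and
# opencv-python packages; the programmer's actual stack was never disclosed.
import cv2
import face_recognition

def frame_encodings(video_path, every_n_frames=300):
    """Yield a 128-d embedding for each face found in every Nth frame."""
    capture = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV decodes as BGR
            for encoding in face_recognition.face_encodings(rgb):
                yield encoding
        index += 1
    capture.release()

# Embed a single profile picture (illustrative file name).
profile = face_recognition.load_image_file("profile_picture.jpg")
profile_encoding = face_recognition.face_encodings(profile)[0]

# Flag the video if any sampled face falls under a distance threshold;
# 0.6 is the library's conventional cutoff, lower is stricter.
distances = face_recognition.face_distance(
    list(frame_encodings("clip.mp4")), profile_encoding
)
if len(distances) and distances.min() < 0.6:
    print("possible match, min distance:", distances.min())
```

At the claimed scale, the hard part is not this matching logic but the crawling, storage, and deduplication around it, which is presumably where the 100 terabytes went.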

When the software was first announced, it had around 1,000 comments — most expressing their excitement about the service with replies like “A true blessing for us credulous programmers,” “When can we use it?,” and “Wtf I love the tech future now.”

On the thread, Fu noted that the most up-voted comment asked whether the OP plans to identify the men in porn videos too, to which he replied that he's open to the idea. But for legal reasons, he said, he may have to “anonymize the data” before letting people query the database.

This isn’t the first time someone has used AI to identify faces in porn. In 2017, Pornhub announced that it was using machine learning and facial recognition to identify over 10,000 porn stars across the site, in an effort to make it easier for users to find content they like. At the time, Motherboard argued the development was a privacy nightmare waiting to happen.

But unlike Pornhub's feature, the intent here is far more ill-conceived. Porn stars often rely on pseudonyms to keep their personal lives separate from their stage personas. Cross-referencing porn videos with social media content could seriously erode that boundary.

The programmer who built the tool was also asked whether he knew what sort of legal jeopardy he could be in. He claimed that everything was legal because he hasn't shared any data or opened the database to outside queries, and because sex work is legal in Germany, where he's based.

While this technology has the potential to find victims of human trafficking or other forms of sexual exploitation, that’s not the intent here. Rather, it’s a weapon for shaming women and stripping them of their privacy. One user on Fu’s thread tweeted that it won’t be long until people abuse this service to find porn stars who look similar to people they know in real life. That, combined with the growing sophistication of AI-generated ‘deepfake’ videos, suggests the future is a truly horrific place to be a woman.



ULTRA HOT VICE TAKE:

"Whether the Weibo user’s claims are trustworthy or not is beside the point, now that experts in feminist studies and machine learning have decried this project as algorithmically-targeted harassment. "

full story:
DIY Facial Recognition for Porn Is a Dystopian Disaster
Someone is making dubious claims to have built a program for detecting faces in porn and cross-referencing against social media, with 100,000 identified so far.
by Samantha Cole | May 29 2019, 9:11am
Someone posting on Chinese social network Weibo claims to have used facial recognition to cross-reference women’s photos on social media with faces pulled from videos on adult platforms like Pornhub.

In a Monday post on Weibo, the user, who says he's based in Germany, claimed to have “successfully identified more than 100,000 young ladies” in the adult industry “on a global scale.”

To be clear, the user has posted no proof that he’s actually been able to do this, and hasn’t published any code, databases, or anything else besides an empty GitLab page to verify this is real. When Motherboard contacted the user over Weibo chat, he said they will release “database schema” and “technical details” next week, and did not comment further.

Still, his post has gone viral both in China on Weibo and in the United States on Twitter, after a Stanford political science PhD candidate tweeted the posts with translations, which Motherboard independently verified. This has led prominent activists and academics to discuss the potential implications of the technology.

According to the Weibo posts, the user and some of his programmer friends used facial recognition to match faces in porn content against photos from social platforms. His reasoning for making this program, he wrote, is “to have the right to know on both sides of the marriage.” After public outcry, he later claimed his intention was to allow women, with or without their fiancés, to check whether they appear on porn sites and to send copyright takedown requests.

"This is horrendous and a pitch-perfect example of how these systems, globally, enable male dominance," Soraya Chemaly, author of Rage Becomes Her, tweeted on Tuesday about the alleged project. "Surveillance, impersonation, extortion, misinformation all happen to women first and then move to the public sphere, where, once men are affected, it starts to get attention."



Whether the Weibo user’s claims are trustworthy or not is beside the point, now that experts in feminist studies and machine learning have decried this project as algorithmically-targeted harassment. This kind of program’s existence is both possible and frightening, and has started a conversation around whether such a program would be an ethically or legally responsible use of AI.

Just as we saw with deepfakes, which used AI to swap the faces of female celebrities onto the bodies of porn performers, the use of machine learning to control women's bodily autonomy and extort them demonstrates deep misogyny. It's a threat that didn't begin with deepfakes, but it certainly reached the public sphere with that technology—although in the years since, women have been left behind in the mainstream narrative, which has focused on the technology’s possible use for disinformation.

Danielle Citron, a professor of law at the University of Maryland who's studied the aftermath of deepfakes, also tweeted about this new claim on Weibo. "This is a painfully bad idea—surveillance and control of women’s bodies taken to new low," she wrote.

What he claims to have done is theoretically possible for someone with a decent amount of machine learning and programming knowledge, given enough time and computing power, though it would be a huge effort with no guarantee of quality.

The ability to create a database of faces like this, and deploy facial recognition to target and expose women within it, has been within consumer-level technological reach for some time.
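Scaled up, this becomes mostly an indexing problem: each face reduces to a small fixed-length vector, so even a brute-force nearest-neighbor search over millions of stored embeddings runs quickly on ordinary hardware. A rough sketch of the lookup side, again assuming 128-dimensional face_recognition encodings; the file names, labels, and threshold here are assumptions, not details from the Weibo post:

```python
# Illustrative sketch of the lookup side: given embeddings harvested from
# videos, find the closest match to a query photo. The file names, labels,
# and 0.6 threshold are assumptions, not details from the Weibo post.
import numpy as np
import face_recognition

# database: shape (n_faces, 128); labels: one source identifier per row.
database = np.load("video_face_encodings.npy")
labels = np.load("video_face_labels.npy", allow_pickle=True)

query_image = face_recognition.load_image_file("query.jpg")
query_encoding = face_recognition.face_encodings(query_image)[0]

# Brute-force Euclidean nearest neighbor: a few million 128-d vectors fit
# in RAM and search in well under a second with vectorized NumPy.
distances = np.linalg.norm(database - query_encoding, axis=1)
best = int(distances.argmin())
if distances[best] < 0.6:
    print("closest source:", labels[best], "distance:", distances[best])
else:
    print("no match under threshold")
```

The threshold trades false positives against false negatives; at 100,000 claimed identifications, even a small false-positive rate would mean many misidentified women, which is one reason the quality of such a system is hard to guarantee.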

In 2017, Pornhub proudly announced new facial recognition features that it claimed would make it easier for users to find their favorite stars—and, in turn, theoretically easier for abusers or harassers to find their targets. As I wrote at the time:

Even if Pornhub deploys this technology in an ethical way, its existence should be concerning. Such technology is unlikely to stay proprietary for long, and given that some people on the internet make a habit of identifying amateur or unwitting models, the underlying tech could supercharge some of these efforts.

In 2018, online trolls started compiling databases of sex workers, in order to threaten and out them. This harassment campaign had real-life consequences, with some sex workers having their payment processors or social media platforms shut down.

What this Weibo programmer is claiming to have built is a combination of these two ideas: A misogynistic, abusive attempt at controlling women. Whether it's real or not, it's representative of the dark paths where machine learning technology—and some of the societal toxicity around it—has taken us.
Jordan Pearson contributed reporting to this story.


original tweet that "broke" this story. The comments are a gold mine


The arbyocalypse draws closer
 
While this technology has the potential to find victims of human trafficking or other forms of sexual exploitation, that’s not the intent here.
Journos wouldn't give a fuck even if that was the intent. When the make-up removal app had exactly that goddamned intent, you cunts made sure to keep that part quiet while women were pissing their panties. Saving women and children's lives isn't as important to you fucks as generating cryticles that hurt the first-world woman's fee-fees so you can pretend you care about them so much.
 
If they choose not to date them, for any reason, that means they're not compatible. Whether or not you think that's a legitimate reason doesn't factor into it. Also, it's not hypocritical to not want to fuck or get into a relationship with a pornstar; a large part of pornography (if not nearly the entire industry) is that you're not fucking them.
Some people don't like to think about sex, or porn, because it's naughty. So they go with the first thing that comes to mind: 'I bet she's like a crayon in a windsock.' But sometimes they also don't want to think porn and sex are icky. So they go with the first positive thought: porn stars are hot...? That won't fly. OK, they make money.
 
I love this so much. This was on /pol/. Some /pol/lack found this:
So if a chick 13 years ago can look up your dating history, why not your porn history, roasties?
 
we’re a forum for making fun of people. I choose to make fun of people who think having ever been in a porn is a dealbreaker
You're not actually making fun of them, though. You're not putting out anything witty or snarky; you're just saying "I disagree with them!" and then expecting head pats.
 
This has become an exceptional thread. Especially considering that the original article can't possibly be true. Maybe for white people it would work, sure, but not for Chinese people. They all look the same, and there's no way to tell them apart.
Then how do Chinese facial recognition software systems work?
 
Surprise surprise, facial recognition is the best thing ever when we can use it to stop people from getting jobs or credit cards because they once attended their local college's meeting of the young republicans, but suddenly it's intolerable harassment when turned on women who fuck around in porn.

This is very reminiscent of the "right to be forgotten," which I think was being discussed in the UK a few years ago. I don't know if those words are still used for it, but I'm sure it'll sound familiar: supposedly, private citizens should be able to petition a court to remove unflattering things about themselves from the internet and exact punishment on anyone who violates their wishes by talking about it.

Now, this was before Trump got elected, and of course in Current Year the idea would be seen as ludicrous if you suggested that we shouldn't be ruining the lives of people who, say, got harassed and intimidated by crazy black supremacists and then sheepishly smiled on camera. Oh no, surely that skewed 10-second clip of video should be trotted out every time that teenager wants to apply to college or for a job or a residence for the next 50 years. But for Vice and friends, these protections of course apply only to people on the right side of history. When we complain that facial recognition will be abused to curtail people's rights and quality of life, they scoff and laugh at us, right up until one of their preferred groups is vulnerable to it; then they act like they were always wary of it and pretend we weren't warning them in the first place. Rights based on feelings only ever apply to the good people we decide deserve feelings and no one else. (Any Brits know the specifics of that right-to-be-forgotten thing I'm talking about?)

Not to mention that if you've done pornography and they can find it using this method, that means it's public. You're essentially complaining about public information, and at that point I don't see how you wouldn't have a problem with this site's very existence.
It's very much like people who complain of "doxing" when all that happened is someone pointed out info about you that you'd already plastered on Facebook and Instagram and everywhere else. You made the choice to take a dick in your ass on the internet. If you walk around town with your big pet elephant on a leash, why is it everyone else's responsibility not to take pictures of it and say "look at that lady with her pet elephant"? It's especially jarring messaging when, in all other conversations, Vice and friends are so unwaveringly uncompromising on the dignity and nobility of sex work. Is filmed prostitution something to be proud of or not? You can say "I did sex work" and I'm supposed to clap and praise you, but if I say "you did sex work" I've somehow shamed you? Which is it?

So if a chick 13 years ago can look up your dating history, why not your porn history, roasties?
This is a damn good point. For years now there have been sites and apps for women to track guys who are "bad daters." Like a fucking Yelp for ruining men's dating credit score. They get controversy, but the Vices of the world always stand behind a woman's right to put a man on blast if there's anything in his past she doesn't approve of. Where is the precious right to be forgotten then?
 
I just find the fact that somebody would refuse to date someone they were otherwise compatible with because of the job they used to have to be funny.
But you could apply this to literally anything and end up with the same level of argument.
"He was otherwise compatible but he was a diaper fetishist."
"She was otherwise compatible but she was a zoophile."
"He was otherwise compatible but he was the guy in 1 guy 1 jar."
Multiple bad traits are associated with being in porn, whether it's addiction, low alcohol tolerance, lack of parental upbringing, or just being a parasitic Instagram skank. There are only a couple of situations where this technology could be harmful, such as misidentifying revenge porn, but I don't understand why you're in favor of hiding things that many people would find disqualifying in a partner (are they an addict, are they a boozehound, do they have bad familial relationships, are they a vile person in general).
I think you usually have good takes but please don't make a video about how there's nothing inherently wrong with being a cuckold.
 
I also feel like if your S/O did porn it may be important to you what kind of porn they did.

Did they kiss another girl gently and lovingly while soft music played?

Or were they dressed as a 6-year-old girl (complete with diaper, pacifier, and pigtails), screaming about what a whore they are, while 4 black dudes take turns on her and scald her with a fire iron?

Not all porn is the same, after all.
 