Science Programmer builds AI algorithm to ‘expose’ adult actresses - find out if you can watch your ex on pornhub, 100k found and counting

Yesterday, Yiqin Fu, a research associate at Yale University, tweeted a thread about a Chinese programmer who claimed to have built an algorithm that identified 100,000 adult actresses by cross-referencing footage from porn videos with social media profile pictures. With this tool, he hopes to help others check whether their girlfriends have ever acted in pornographic films.

A Germany-based Chinese programmer said he and some friends have identified 100k porn actresses from around the world, cross-referencing faces in porn videos with social media profile pictures. The goal is to help others check whether their girlfriends ever acted in those films. pic.twitter.com/TOuUBTqXOP



— Yiqin Fu (@yiqinfu) May 28, 2019

The facial recognition tool reportedly took half a year to build and draws on over 100 terabytes of video data pulled from sites including Pornhub, 91, 1024, sex8, and xvideos. That footage was compared against profile pictures from Facebook, Instagram, TikTok, Weibo, and others.

When the software was first announced, it drew around 1,000 comments, most expressing excitement about the service with replies like “A true blessing for us credulous programmers,” “When can we use it?,” and “Wtf I love the tech future now.”

On the thread, Fu noted that the most up-voted comment asked whether the OP plans to identify the men in porn videos as well, to which he replied that he’s open to the idea. But for legal reasons, he said, he may have to “anonymize the data” before letting people query the database.

This isn’t the first time someone has used AI to identify faces in porn. In 2017, Pornhub announced that it was using machine learning and facial recognition to detect over 10,000 porn stars across the site in an effort to make it easier for users to find content they like. At the time, Motherboard argued the development was a privacy nightmare waiting to happen.

But unlike Pornhub’s feature, the intent behind this tool is far more ill-conceived. Porn stars often rely on pseudonyms to keep their personal lives separate from their stage personas, and cross-referencing porn videos with social media content could seriously endanger that boundary.

The programmer who built the tool was also asked whether he understood the legal jeopardy he could be in. He claimed that everything is legal because he hasn’t shared any data or opened the database to outside queries, and because sex work is currently legal in Germany, where he’s based.

While this technology has the potential to find victims of human trafficking or other forms of sexual exploitation, that’s not the intent here. Rather, it’s a weapon for shaming women and stripping them of their privacy. One user on Fu’s thread tweeted that it won’t be long before people abuse the service to find porn stars who look similar to people they know in real life. That, combined with the growing sophistication of AI-generated ‘deepfake’ videos, suggests the future will be a truly horrific place to be a woman.



ULTRA HOT VICE TAKE:

"Whether the Weibo user’s claims are trustworthy or not is beside the point, now that experts in feminist studies and machine learning have decried this project as algorithmically-targeted harassment. "

full story
DIY Facial Recognition for Porn Is a Dystopian Disaster
Someone is making dubious claims to have built a program for detecting faces in porn and cross-referencing against social media, with 100,000 identified so far.
by Samantha Cole | May 29, 2019, 9:11am
Someone posting on Chinese social network Weibo claims to have used facial recognition to cross-reference women’s photos on social media with faces pulled from videos on adult platforms like Pornhub.

In a Monday post on Weibo, the user, who says he's based in Germany, claimed to have “successfully identified more than 100,000 young ladies” in the adult industry “on a global scale.”

To be clear, the user has posted no proof that he’s actually been able to do this, and hasn’t published any code, databases, or anything else besides an empty GitLab page to verify this is real. When Motherboard contacted the user over Weibo chat, he said they will release “database schema” and “technical details” next week, and did not comment further.

Still, his post has gone viral both in China on Weibo and in the United States on Twitter after a Stanford political science PhD candidate tweeted it with translations, which Motherboard independently verified. This has led prominent activists and academics to discuss the potential implications of the technology.

According to the Weibo posts, the user and some of his programming friends used facial recognition to detect faces in porn content and match them against photos from social platforms. His reasoning for making this program, he wrote, is “to have the right to know on both sides of the marriage.” After public outcry, he later claimed his intention was to allow women, with or without their fiancés, to check if they are on porn sites and to send a copyright takedown request.

"This is horrendous and a pitch-perfect example of how these systems, globally, enable male dominance," Soraya Chemaly, author of Rage Becomes Her, tweeted on Tuesday about the alleged project. "Surveillance, impersonation, extortion, misinformation all happen to women first and then move to the public sphere, where, once men are affected, it starts to get attention."



Whether the Weibo user’s claims are trustworthy or not is beside the point, now that experts in feminist studies and machine learning have decried this project as algorithmically-targeted harassment. This kind of program’s existence is both possible and frightening, and has started a conversation around whether such a program would be an ethically or legally responsible use of AI.

Just as we saw with deepfakes, which used AI to swap the faces of female celebrities onto the bodies of porn performers, the use of machine learning to control and extort women's bodily autonomy demonstrates deep misogyny. It's a threat that didn't begin with deepfakes, but certainly reached a public sphere with that technology—although in the years since, women have been left behind in the mainstream narrative, which has focused on the technology’s possible use for disinformation.

Danielle Citron, a professor of law at the University of Maryland who's studied the aftermath of deepfakes, also tweeted about this new claim on Weibo. "This is a painfully bad idea—surveillance and control of women’s bodies taken to new low," she wrote.

What he claims to have done is theoretically possible for someone with a decent amount of machine learning and programming knowledge, given enough time and computing power, though it would be a huge effort with no guarantee of quality.

The ability to create a database of faces like this, and deploy facial recognition to target and expose women within it, has been within consumer-level technological reach for some time.
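
To make concrete what “consumer-level technological reach” looks like, here is a minimal, purely illustrative sketch of face matching with the open-source face_recognition Python library. Nothing about the Weibo user’s actual pipeline is known (he has published no code), and the file names below are placeholder assumptions, not real data.

```python
# Illustrative sketch only: match one face from a video frame against a few
# profile photos using the open-source face_recognition library.
# File names are placeholders; this is not the Weibo user's (unpublished) code.
import face_recognition

# Encode the face found in a single video frame as a 128-dimensional embedding.
frame = face_recognition.load_image_file("video_frame.jpg")
frame_encodings = face_recognition.face_encodings(frame)

# Encode a handful of profile pictures to compare against.
profile_files = ["profile_a.jpg", "profile_b.jpg"]
profile_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(f))[0]
    for f in profile_files
]

if frame_encodings:
    # Lower distance means more similar; the library's default match threshold is 0.6.
    distances = face_recognition.face_distance(profile_encodings, frame_encodings[0])
    for name, dist in zip(profile_files, distances):
        print(f"{name}: distance {dist:.3f} ({'match' if dist < 0.6 else 'no match'})")
```

Matching one face against a few photos like this is trivial on a laptop; the hard part of what the Weibo user claims is scale, running it across 100 terabytes of video and millions of profile photos, which is why it would be a huge effort with no guarantee of quality.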

In 2017, Pornhub proudly announced new facial recognition features that it claimed would make it easier for users to find their favorite stars—and, in turn, theoretically easier for abusers or harassers to find their targets. As I wrote at the time:

Even if Pornhub deploys this technology in an ethical way, its existence should be concerning. Such technology is unlikely to stay proprietary for long, and given that some people on the internet make a habit of identifying amateur or unwitting models, the underlying tech could supercharge some of these efforts.

In 2018, online trolls started compiling databases of sex workers, in order to threaten and out them. This harassment campaign had real-life consequences, with some sex workers having their payment processors or social media platforms shut down.

What this Weibo programmer is claiming to have built is a combination of these two ideas: A misogynistic, abusive attempt at controlling women. Whether it's real or not, it's representative of the dark paths where machine learning technology—and some of the societal toxicity around it—has taken us.
Jordan Pearson contributed reporting to this story.


original tweet that "broke" this story. The comments are a gold mine


The arbyocalypse draws closer
 
And if the exact same thing existed to find people who had ever voted Republican, they'd be all for it.
China and England have been employing the same kind of tech to find people who did wrong. These people didn't care then, either. Even though England's has been pretty dodgy and got over 2k wrong matches.
 
As long as she wasn't your girlfriend when she acted in the films why would anybody give a fuck? Gotta pay the bills

The VICE authors are pathetic lolcows, but anyone going to the trouble to make an algorithm just to dox porn actresses is probably a bigger lolcow imo
 
As long as she wasn't your girlfriend when she acted in the films why would anybody give a fuck? Gotta pay the bills

The VICE authors are pathetic lolcows, but anyone going to the trouble to make an algorithm just to dox porn actresses is probably a bigger lolcow imo

he says kissing his loving wife, unwittingly getting the full viral biodiversity of 50 football teams on his lips
 
At least saucefags on /gif/ will shut the fuck up now.
Implying any of them would take the time to use this rather than just asking for sauce.


As long as she wasn't your girlfriend when she acted in the films why would anybody give a fuck?
IMO It shows poor short-term judgement and a lack of self-respect that would be indicative of someone that I wouldn't want to be in a relationship with.
 
experts in feminist studies and machine learning
:story:

This could be useful to know who is good for a quick screw and who is relationship material. Do it to dudes too, the more the merrier. Hell, LINK me to the porn so I can send it to their parents, significant other and children.
 
IMO It shows poor short-term judgement and a lack of self-respect that would be indicative of someone that I wouldn't want to be in a relationship with.
Somebody's gotta be in porn. Personally I don't see what the issue is with it, we choose to judge folks for it, it's not some natural law of the universe that being an actress in a porno makes you a bad person. It's just a job, I don't see why people make such a big deal out of it just cause sex is involved. It's just sex, why do people make such a big deal about it?
 
Somebody's gotta be in porn. Personally I don't see what the issue is with it, we choose to judge folks for it, it's not some natural law of the universe that being an actress in a porno makes you a bad person. It's just a job, I don't see why people make such a big deal out of it just cause sex is involved. It's just sex, why do people make such a big deal about it?
Are you a jigaloo
 
Somebody's gotta be in porn. Personally I don't see what the issue is with it, we choose to judge folks for it, it's not some natural law of the universe that being an actress in a porno makes you a bad person. It's just a job, I don't see why people make such a big deal out of it just cause sex is involved. It's just sex, why do people make such a big deal about it?

I get your logic but sex work brings up legitimate concerns of things like potential diseases or the mundanity of intercourse (which is also a problem with people who sleep around too much). There's also always the fucked up underground shit where some actors are unwilling and victims of trafficking. Hell now that I think about it this may be useful for law enforcement.
 
Somebody's gotta be in porn.
There doesn't have to be porn, especially on the industrial level that we (myself unfortunately included) imbibe it, but even if there for some reason always had to be porn that wouldn't change the fact that the act would be viewed as either positive or negative depending on the norms of the society.
it's not some natural law of the universe that being an actress in a porno makes you a bad person
I find the word "bad" pretty unhelpful for this discussion. The point is that being a porn star does indicate personality traits that some would find unattractive in a prospective partner.
It's just a job
That's a stupid statement. People treat/view different jobs in different ways. You wouldn't look at the gas station cashier the same way as you would a lawyer.
just cause sex is involved
Sex is an incredibly intimate and important aspect of life for most people. Rampant promiscuity is looked down upon by virtually every major culture in human history for a reason.

@ProgKing of the North Nobody's saying you can't be a pornstar but people have the right to treat you accordingly - and that includes not wanting to date you.
 