Law Twitter refused to remove child porn because it didn’t ‘violate policies’: lawsuit - Right-wingers? Against TOS. CP? @Jack approves.

Twitter refused to take down widely shared pornographic images and videos of a teenage sex trafficking victim because an investigation “didn’t find a violation” of the company’s “policies,” a scathing lawsuit alleges.

The federal suit, filed Wednesday by the victim and his mother in the Northern District of California, alleges Twitter made money off of the clips, which showed a 13-year-old engaged in sex acts and are a form of child sexual abuse material, or child porn, the suit states.

The teen — who is now 17 and lives in Florida — is identified only as John Doe and was between 13 and 14 years old when sex traffickers, posing as a 16-year-old female classmate, started chatting with him on Snapchat, the suit alleges.

Doe and the traffickers allegedly exchanged nude photos before the conversation turned to blackmail — if the teen didn’t share more sexually graphic photos and videos, the explicit material he’d already sent would be shared with his “parents, coach, pastor” and others, the suit states.

Doe, acting under duress, initially complied and sent videos of himself performing sex acts and was also told to include another child in his videos, which he did, the suit claims.

Eventually, Doe blocked the traffickers and they stopped harassing him, but at some point in 2019 the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege.

Over the next month, the videos would be reported to Twitter at least three times — first on Dec. 25, 2019 — but the tech giant failed to do anything about it until a federal law enforcement officer got involved, the suit states.

Doe became aware of the tweets in Jan. 2020 because they’d been viewed widely by his classmates, which subjected him to “teasing, harassment, vicious bullying” and led him to become “suicidal,” court records show.

While Doe’s parents contacted the school and made police reports, he filed a complaint with Twitter, saying there were two tweets depicting child pornography of himself and that they needed to be removed because they were illegal, harmful and in violation of the site’s policies.

A support agent followed up and asked for a copy of Doe’s ID to prove it was really him; after the teen complied, there was no response for a week, the family claims.

Around the same time, Doe’s mother filed two complaints with Twitter reporting the same material and, for a week, she also received no response, the suit states.

Finally, on Jan. 28, Twitter replied to Doe and said it wouldn’t be taking down the material, which had already racked up more than 167,000 views and 2,223 retweets, the suit states.

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” the response reads, according to the lawsuit.

“If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

In his response, published in the complaint, Doe appeared shocked.

“What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down,” the teen wrote back to Twitter.

He even included his case number from a local law enforcement agency, but the tech giant still allegedly ignored him and refused to do anything about the illegal child sexual abuse material as it continued to rack up more and more views.

Two days later, Doe’s mom was connected through a mutual contact with an agent from the Department of Homeland Security, who had the videos removed on Jan. 30, the suit states.

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” the suit, filed by the National Center on Sexual Exploitation and two law firms, states.

“This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.”

The disturbing lawsuit goes on to allege that Twitter knowingly hosts creeps who use the platform to exchange child porn, and that it profits by interspersing ads between tweets advertising or requesting the material.

Twitter declined to comment when contacted by The Post.
 
I have questions.
167,000 views and 2,223 retweets seems like a lot.
It was uploaded twice.
No one else reported it to the authorities? Yet it was widespread enough, and up long enough, for his classmates to find it and identify him? What hashtag did they search?
I imagine that Twitter’s auto-moderation is partly to blame (a rough sketch of how that can fail is below). It’s also possible the images and videos were edited or censored in a way that made it less obvious what was going on.
Twitter being a haven for sex workers doesn’t help. If they had policies against porn and actively removed it, they wouldn’t have to worry about stuff like this.
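To be concrete about that failure mode: platforms typically screen uploads against hashes of known CSAM (Twitter reportedly uses Microsoft's PhotoDNA, which is proprietary). Here's a minimal sketch using a dHash-style perceptual hash instead; the file names and the match threshold are made up:

```python
# Sketch of why hash-matching moderation can miss edited reuploads.
# dHash stands in here for proprietary systems like PhotoDNA; the
# file names and threshold are hypothetical.
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    """Difference hash: shrink to a tiny grayscale image, then
    compare each pixel to its right-hand neighbor."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

THRESHOLD = 10  # hypothetical: distances above this count as "no match"

known = dhash("known_fingerprint.png")        # hypothetical file
upload = dhash("cropped_captioned_copy.png")  # hypothetical file
if hamming(known, upload) > THRESHOLD:
    print("no auto-match; report goes to a human review queue")
```

A crop, border, caption overlay, or heavy re-encode shifts enough bits that the distance lands over the threshold, so the edited copy never auto-matches and the report falls to a human reviewer instead.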
 
Remember when it was Pornhub getting banned by payment processors over legal porn and for not taking down illegal porn fast enough?
Good times.

Twitter refusing to take down CP and making money off ads *shrugs*.

🌈How long until Twitter gets banned from payment processors? 🌈
How long before Twitter advocates for fair banking?
 
Remember when it was Pornhub getting banned by payment processors over legal porn and for not taking down illegal porn fast enough?
Good times.

Twitter refusing to take down CP and making money off ads *shrugs*.

🌈How long until Twitter gets banned from payment processors? 🌈
I'm just glad it was one of the big Silicon Valley tech companies that did this and not some other site like Gab or Parler. The last thing we need is more people calling for the Internet to be censored.
 
I'm just glad it was one of the big Silicon Valley tech companies that did this and not some other site like Gab or Parler. The last thing we need is more people calling for the Internet to be censored.
CP gets posted on every social media site; the issue is the corporate response.
What the retards in the Pornhub thread couldn't understand is that PH did far more than Twitter or Facebook ever has in deleting illegal or even questionable content.
Sites like Gab and Parler know from the start that they're going to be under the microscope, and they are far more stringent to avoid press like this (see: the Gab hentai jihad).
 
So... Has anyone bothered to look into the backgrounds of the Twitter higher ups?
There's no way they don't have kiddy porn on their laptops.
I feel like people started saying this when Twitter freaked out about Pizzagate, though mostly in a cynical, somewhat joking way. I don’t think anyone thinks it’s a joke these days.
 
Unpopular opinion, but when are we going to hold teenagers responsible for doing this bullshit? By this point, you definitely know better than to post your nudes around, whether it's because, hello, YOU'RE A MINOR, or because you should already be aware of how volatile the open Internet is and will be.
The idiot was purposely posing as a 16-year-old, and no doubt thought nothing bad would come of it until it turned against him. Don't care if this makes me seem like a CP or TOS apologist; again, there's no excuse for not being aware, and no excuse for not parenting properly, etc.
 
Unpopular opinion, but when are we going to hold teenagers responsible for doing this bullshit? By this point, you definitely know better than to post your nudes around, whether it's because, hello, YOU'RE A MINOR, or because you should already be aware of how volatile the open Internet is and will be.
The idiot was purposely posing as a 16-year-old, and no doubt thought nothing bad would come of it until it turned against him. Don't care if this makes me seem like a CP or TOS apologist; again, there's no excuse for not being aware, and no excuse for not parenting properly, etc.
That falls on the parents. We do not need more kids in juvie, or more bureaucrats doing the parenting for us.
 
That falls on the parents. We do not need more kids in juvie, or more bureaucrats doing the parenting for us.
I don't want kids going to juvie for sharing illegal pictures of themselves, but I am definitely tired of the blame shifting toward only the adults and the major corporations in this case. The only reason this kid got into the situation they were in is that they personally took the time to go into their bedroom/private bathroom/fucking somewhere, willingly stripped, and willingly took these pictures and sent them to someone else. Again, the retard went as far as to say they were 16, probably to bypass any Romeo/Juliet laws Florida has. It obviously isn't the same as some fucked-up person kidnapping a child and forcing them to do similar, but why treat these two situations/contexts like they're on the same level, or even "identical"?

Don't really want to PL here either, but the most I'll say is that I've done this sort of shit. Not proud of it; wish I could turn back time to "fix" it. As much as my parents are to blame, I really don't see a reason to put it solely on them. I was more than aware of my actions and knew I shouldn't do that shit, so I know for a fact that other teens are the same way. Nobody ever wants to admit this shit, so I will.
 
[attachment: 1611261757766.png]

edit: side note, it's been against TOS for years but not enforced because of the law

[attachment: 1611261888805.png]
 