US TikTok isn't protected by Section 230 in 10-year-old’s ‘blackout challenge’ death - Recommendations are now protected speech, so you're liable for them!

A US appeals court has issued an opinion that could have wide-ranging implications for social media platforms, finding that content selected for users by TikTok's algorithms doesn't qualify for Section 230 protection.

In an opinion [PDF] published today, a three-judge panel from the Third Circuit Court of Appeals in Pennsylvania decided that, because TikTok presented "blackout challenge" posts to 10-year-old Nylah Anderson on her For You Page of recommended content, the platform deserves to be taken to court for her death that followed.

The "blackout challenge" refers to a dangerous self-asphyxiation "trend" that went around on TikTok several years ago. Anderson attempted to participate in the challenge, leading to her death, but a lower-court judge decided in 2022 that TikTok was protected by Section 230 of the Communications Decency Act (CDA), which protects social media platforms from liability for content posted by their users.

The Third Circuit court sharply disagreed.

"TikTok knew that Nylah would watch [the blackout challenge video] because the company's customized algorithm placed the videos on her 'For You Page' after it 'determined that the Blackout Challenge was 'tailored' and 'likely to be of interest' to Nylah,'" Judge Paul Matey wrote in a partial concurrence included in the decision.

Matey argued that Section 230's application has evolved far beyond Congress's original intent when it passed the CDA in 1996, which was not to "create a lawless no-man's land" of legal liability.

"The result is a Section 230 that immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm," Matey said.

Judge Patty Shwartz wrote in the main body of the opinion that the Third Circuit's reading of Section 230 is reinforced by the recent Moody v NetChoice decision from the US Supreme Court. In that case, related to content moderation laws passed in Florida and Texas, SCOTUS held that a platform's curation algorithms reflect editorial judgments. Shwartz wrote that an algorithm's output is a compilation of third-party speech made in the manner the platform chooses, and thus merits First Amendment protection.

"Given the Supreme Court's observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others' content via their expressive algorithms, it follows that doing so amounts to first-party speech under Section 230, too," Shwartz reasoned.

In short, you can't have it both ways: Either you serve everything, let users sort it out and keep that liability shield; or you make algorithmic picks that surface content, give users what you think they want and take on the liability that comes with being the arbiter of that content.

With the appeal decided, Anderson's case will head back to the District Court in the Eastern District of Pennsylvania to be re-heard.

"Today's opinion is the clearest statement to date that Section 230 does not provide this catchall protection that the social media companies have been claiming it does," Anderson family lawyer Jeffrey Goodman told the Associated Press regarding the outcome.

TikTok didn't respond to questions for this story.

It's not immediately clear what sort of broader reach the Third Circuit's opinion could have on Section 230 protections for other social media platforms, but the three-judge panel that made the call knows it's likely to be applied elsewhere: It's a precedential one.

---------------------

A very successful parent had their child die from a "blackout challenge" (choking yourself) and decided to sue TikTok for not jannying quickly enough. As much as I'm inclined to cheer the downfall of TikTok and other social media companies, I think this case sets a bad precedent - no one seriously thinks that Google search recommendations are somehow endorsed by Google as factually true, for example.

Article: https://www.theregister.com/2024/08/28/tiktok_blackout_challenge_appeal/
Archive (pending): https://archive.ph/Eg97c

The court's decision is attached.
 


What is the difference if, instead of promoting specific posts For You, they identify posts they think you will like, and then demote all other content?
Nothing, if the "demoted" content is selected precisely by not being in the set of "things you will like". You're describing the same thing; it's barely a semantic difference, and not one that would fool anybody.
But it's not a two-sided coin. Global demotion, or actually demoting "things you won't like" are distinct. That's like 3 to 5 sides for your coin, minimum.
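For what it's worth, the equivalence the posts above argue over is easy to make concrete. A minimal sketch with made-up post IDs (nothing from TikTok's actual system): if "demote" means "rank everything outside the predicted-like set lower," the resulting feed ordering is identical to promoting that set.

```python
def promote(posts, liked):
    """Surface predicted-'for you' posts first, everything else after."""
    return [p for p in posts if p in liked] + [p for p in posts if p not in liked]

def demote(posts, liked):
    """Rank down every post NOT in the predicted-like set.

    sorted() is stable, so relative order within each group is preserved,
    exactly as in promote().
    """
    return sorted(posts, key=lambda p: p not in liked)

feed = ["cat", "challenge", "news", "dance"]
liked = {"challenge", "dance"}
assert promote(feed, liked) == demote(feed, liked) == ["challenge", "dance", "cat", "news"]
```

Global demotion, or demoting a separately modeled "things you won't like" set, would produce different orderings; that's the extra coin sides the reply points out.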
 
Desire of Butlerian Jihad Intensifies.
Born too late to explore the planet.
Born too early to explore the stars.
Born just in time to become a prototype Mentat.
what are the bets on real ID for social media?
guard rails to detect & restrict the under-18s?

my guess -
* a month of setup & seething over having to do real ID.
* if social media pushes for Supreme Court review this year, you'll know it's over if the tune changes about using real ID.
* betting on chat restrictions & video restrictions for new accounts of under-18s.
...

my only disagreement is the china tiktok meme -


It's always "save the children." Every fucking time.
Forgive my unhinged rant, but fuck them kids.
I've got enough gray to not get carded, but I absolutely refuse to give up my info and ID to people who get breached every other month.
 
This ruling doesn't make any sense; how could an automated machine's behavior count as protected speech? It shouldn't count as speech at all unless it's coming from a person. Does this mean I get to sue a car manufacturer if I get into a car accident because I was using the cruise control?
 
TikTok presented "blackout challenge" posts to 10-year-old Nylah Anderson on her For You Page of recommended content, the platform deserves to be taken to court for her death that followed.
There's an easy solution I don't think anyone is seeing

If TikTok knew this girl was 10 years old, then they are liable for keeping the platform safe for children and regulating their algorithms to keep this kind of stuff out, and should be sued into oblivion.
If TikTok thought the girl was 18+, then they aren't liable for shit.
 
I don't understand this ruling or its implications, but I instinctively support anything our tech overlords are against.
The Trending tab that is curated by YouTube/corpo AI/algorithm is being considered platform speech.
That platform speech is what caused the girl to kill herself via the blackout challenge.
The blackout challenge should have been filtered or banned as self-harm, but corpo going corpo.
 
What this will probably mean for us is that Null will have to get rid of the highlighted post feature since it is algorithmically driven, in the loosest sense, to avoid any liabilities. Featured content on the front page might have to be done away with too, since that is also technically unprotected editorializing I think, since promotion of content is also editorialization. That won't be too bad though, because the Community Driven Happening Thread is still fair game.
Featuring is editorializing, sure, but highlights are not a judgment call; they're pretty much just areas of high activity, which usually are the quality posts.

Sure, you can take it to absurd degrees: by that logic, whatever algorithm decides which region of the screen needs updating and tells the OS's API to blit the bitmap of the window area displaying the text is then also liable for the content, and that's just silly.
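The distinction being drawn, a mechanical threshold versus an editorial pick, can be sketched like this (the field name and threshold are invented for illustration):

```python
def highlights(posts, reaction_threshold=50):
    # Purely mechanical rule: every post clearing a fixed activity bar
    # gets highlighted, with no human judgment about any individual post.
    return [p for p in posts if p["reactions"] >= reaction_threshold]

posts = [
    {"title": "quality effortpost", "reactions": 120},
    {"title": "one-liner", "reactions": 3},
]
assert highlights(posts) == [{"title": "quality effortpost", "reactions": 120}]
```

The rule never looks at what a post says, only how much activity it drew, which is the "not a judgment call" point the post above is making.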
 
Ones like, "Don't try this at home. This is done by experts. If you try to replicate this you could get hurt. We are not responsible for any injury." etc etc, ones that protect them from lawsuits basically.
I have never seen YouTube append that disclaimer. At any rate, it would be for a jury to decide whether that negates the negligence. I'd say it would.
Will Tik Tok appeal this ruling to the Supreme court?
Because I famously work for Tik Tok.
This ruling makes implementation of said network that spies on users, in search of CP, more likely; they get your data and it doesn't matter whether you're innocent or guilty.
What the fuck are you on about, and how is that even remotely related to anything discussed in the case?
This ruling doesn't make any sense; how could an automated machine's behavior count as protected speech?
Complain to SCOTUS who decided that algorithms are speech.
The Trending tab that is curated by YouTube/corpo AI/algorithm is being considered platform speech.
The 'speech' at issue was in the "for you" tab.
 
It shouldn't count as speech at all unless it's coming from a person. Does this mean I get to sue a car manufacturer if I get into a car accident because I was using the cruise control?
Not if the cruise control functions and works as described.
But if the car manufacturer forced you to accept a car manufacturer employee sitting in the car with you everywhere you go, and he was the only one allowed to operate the cruise control, and he fucked up and caused the accident, yes.
If they replaced the man with a decision-making robot that also fucked up at the human function it's performing, it wouldn't change the nature of that function nor would it excuse the fuck-up.

Analogies are stupid.
 
The 'speech' at issue was in the "for you" tab.
how did the Explore tab escape the net?
edit: forgot the judge didn't know that info believees counts as "search"

Because, and this will be extremely hard to understand, but try and stick with me: the content in question was recommended in the kid's 'For You' tab, a tab specifically for TikTok's algorithm to show you stuff it wants you to see.

This context is extremely hard to understand and would require you to have been able to read paragraph one of the order, or paragraphs two and/or four of the article. I realize that this is a difficult task.
How about age-restricting social media and making the parents liable for any harm that comes to or from the child using it?
it disenfranchises the black & youth voice.
 
What's especially retarded about this ruling is that if a machine algorithm's behavior is ruled protected speech, then the algorithm's behavior falls under the First Amendment, which means the computer has freedom of speech, and therefore you couldn't sue it for what it says, because its recommendations are protected under the First.

In their attempt to censor freedom of speech, they just extended it to unthinking mathematical expressions. That would mean that TikTok shouldn't be liable for what their algorithm recommended anyway, because if an algorithm has freedom of speech, that would mean it's considered to be endowed with personhood, and as an entity endowed with personhood, the algorithm would logically be liable for its own behavior.

Nothing about this ruling makes any sense.

Furthermore, to what degree does this ruling extend? Does any sorting method whatsoever count as algorithmic protected speech as well? Threads with more posts climb higher on a forum than those with fewer; does this mean that sorting by popularity, by nature of being an automatic process, counts as protected speech authored by the owner? What about third-party software purchased and implemented by the site owner? Who's liable in that circumstance, the software developer or the person implementing it on their site?

This dumb ruling throws all manner of liability into chaos and none of it will make any sense because this ruling was made by activist judges.
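To make the sorting question above concrete, here's a sketch of the kind of content-neutral ordering the post describes (the thread metadata and field names are made up): the same ranking is shown to every viewer and is derived only from thread activity, not from a profile of the reader, which is arguably a different animal from a per-user "for you" feed.

```python
# Hypothetical thread metadata; fields are invented for illustration.
threads = [
    {"title": "Thread A", "replies": 40,  "last_post": 1_725_478_000},
    {"title": "Thread B", "replies": 120, "last_post": 1_725_460_000},
]

# Content-neutral sorts: identical output for every viewer, no per-user model.
by_popularity = sorted(threads, key=lambda t: t["replies"], reverse=True)
by_recency    = sorted(threads, key=lambda t: t["last_post"], reverse=True)

assert by_popularity[0]["title"] == "Thread B"
assert by_recency[0]["title"] == "Thread A"
```

Whether courts would actually draw the line at personalization is an open question; the sketch just shows that "automatic" and "tailored to a specific user" are separable properties.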
 
What the fuck are you on about, and how is that even remotely related to anything discussed in the case?
Websites had 230-type protections until people sued sites where their private information was dropped.
Later came one school shooting, then a mall shooting, then lawsuits galore.
Same thing, different results.
I should've mentioned that in the first place; we've already been through what is going to happen now.
 
how did the Explore tab escape the net?
Because, and this will be extremely hard to understand, but try and stick with me: the content in question was recommended in the kid's 'For You' tab, a tab specifically for TikTok's algorithm to show you stuff it wants you to see.

This context is extremely hard to understand and would require you to have been able to read paragraph one of the order, or paragraphs two and/or four of the article. I realize that this is a difficult task.
I should have said: do you think they will or can they even?
They can, and they probably will.
 