TikTok isn't protected by Section 230 in 10-year-old’s ‘blackout challenge’ death - Recommendations are now protected speech, so you're liable for them!

A US appeals court has issued an opinion that could have wide-ranging implications for social media platforms, finding that content selected for users by TikTok's algorithms doesn't qualify for Section 230 protection.

In an opinion [PDF] published today, a three-judge panel from the Third Circuit Court of Appeals in Pennsylvania decided that, because TikTok presented "blackout challenge" posts to 10-year-old Nylah Anderson on her For You Page of recommended content, the platform deserves to be taken to court for her death that followed.

The "blackout challenge" refers to a dangerous self-asphyxiation "trend" that went around on TikTok several years ago. Anderson attempted to participate in the challenge, leading to her death, but a lower-court judge decided in 2022 that TikTok was protected by Section 230 of the Communications Decency Act (CDA), which protects social media platforms from liability for content posted by their users.

The Third Circuit court sharply disagreed.

"TikTok knew that Nylah would watch [the blackout challenge video] because the company's customized algorithm placed the videos on her 'For You Page' after it 'determined that the Blackout Challenge was 'tailored' and 'likely to be of interest' to Nylah,'" Judge Paul Matey wrote in a partial concurrence included in the decision.

Matey argued that the application of Section 230 has drifted far beyond Congress's original intent when it passed the CDA in 1996, which was not to "create a lawless no-man's land" free of legal liability.

"The result is a Section 230 that immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm," Matey said.

Judge Patty Shwartz wrote in the main body of the opinion that the Third Circuit's reading of Section 230 is reinforced by the recent Moody v NetChoice decision from the US Supreme Court. In that case, concerning content moderation laws passed in Florida and Texas, SCOTUS held that a platform's curation algorithms reflect editorial judgments. Shwartz wrote that the resulting feed is a compilation of third-party speech assembled in the manner the platform chooses, and thus merits First Amendment protection.

"Given the Supreme Court's observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others' content via their expressive algorithms, it follows that doing so amounts to first-party speech under Section 230, too," Shwartz reasoned.

In short, you can't have it both ways: Either you serve everything, let users sort it out and keep that liability shield; or you make algorithmic picks that surface content, give users what you think they want and take on the liability that comes with being the arbiter of that content.

With the appeal decided, Anderson's case will head back to the District Court in the Eastern District of Pennsylvania to be re-heard.

"Today's opinion is the clearest statement to date that Section 230 does not provide this catchall protection that the social media companies have been claiming it does," Anderson family lawyer Jeffrey Goodman told the Associated Press regarding the outcome.

TikTok didn't respond to questions for this story.

It's not immediately clear what sort of broader reach the Third Circuit's opinion could have on Section 230 protections for other social media platforms, but the three-judge panel that made the call knows it's likely to be applied elsewhere: It's a precedential one.

---------------------

A very successful parent had their child die from a "blackout challenge" (choking yourself) and decided to sue TikTok for not jannying quickly enough. As much as I'm inclined to cheer the downfall of TikTok and other social media companies, I think this case sets a bad precedent - no one seriously thinks that Google search recommendations are somehow endorsed by Google as factually true, for example.

Article: https://www.theregister.com/2024/08/28/tiktok_blackout_challenge_appeal/
Archive (pending): https://archive.ph/Eg97c

The court's decision is attached.
 


In theory this should lead to greater free speech, but in reality I hope we all know that it will lead to even MORE burying of controversial content like guns, politically sensitive topics, suicide, etc. YouTube already lowers the chance of that stuff appearing in the algorithm and slaps giant warning labels on it; now they might force you to search for it to even find it at all, if not outright banning it. I think this decision is going to have seriously bad reverberations.

Exactly. I'm trying to figure out why so many people here seem to think this is a win or something.

How can you have any algorithm if you are liable for the content within? The home page is an algorithm, the search results are an algorithm, ratings sort is an algorithm, autoplay is an algorithm... Do you expect that Google will shutter and gimp half their products? Do you people think TikTok actually boosted 'blackout' content, or just failed to censor it? How is this anything but a ruling that undermines 230 in favor of stringent censorship/content vetting?
 
My thoughts too. I know big platforms have elaborate algorithms, and if those are not under the full protection of 230, it fucks with big platforms. Specifically the platform, not the user, and I don't mind that. But on the other hand, I am not very smart and I might be missing the full picture, because KF is also affected, and generally the big platforms give much less of a fuck; it's the smaller places that are more burdened with adapting to the new bullshit.
My immediate reaction is that it’ll just launch an avalanche of bullshit lawsuits, but I’m more than willing to be proven wrong.
 
Parent only started caring about the kid once they died, I guess?
This is why TikTok moms piss me off now. I babysat for some kids once and they were glued to their phones, with acrylic nails and Stanley cups. I know Stanley cups aren't that bad, but acrylic nails can't be good for little kid fingers. The parents don't care, they just want their kids to shut up so they can make OF content
 
I know it's massively faggy that Facebook et al get to just boot controversial opinions from their service. But in my eternal pessimism about tech and legislation, I don't trust our system to set up rules that let us impose rules on Facebook without giving a cause of action to outright bankrupt the farms with litigation.
This case isn't about deleting content. It is about the social media companies intentionally driving views to some content over other content, which is editorializing.

Contrast that with Kiwifarms, which presents the same experience to everyone. The customized feeds are user-created. The site does not take a stance on which content should be presented; it is a repository of content.

Yeah, I get that, but I think editorial responsibility really depends on how the algorithm is constructed. Like when YouTube shows me troon videos, that is YouTube's editorial choice making people watch a thing. When it shows me ordinary nerd shit, it is probably site users promoting the content themselves, like stickers featuring a post here.
It is one thing to sort the existing content being presented. That is not an algorithm in the way most people use the term. It is another to take a subset of that content and promote it over other content, as well as deprioritizing other content. Once you depart from a unified ordering of that content you are making editorial judgments.
 
Legally obligated content removal would fall under compelled speech and would not qualify as the editorial approval or algorithmic curation that is the platform operator's speech.

Section 509 is of dubious necessity. It is also worded such that the liability protection applies only to restricting access and availability to content (and even then, only certain types and classes of content). Promoting content (i.e., the "For You" feeds) would not be protected under Section 509, consistent with the court's ruling.
I disagree with the whole framing of this. To promote is also to demote. They are two sides of the same coin. If they have the power to restrict access, remove and demote content without being liable for the content as 230 says, then that means they have implicit power to do the opposite without being liable. Instead of calling it a "For You" feed, TikTok could have called it a "Not showing you anything that is not For You", and it would be identical. What is the difference if, instead of promoting specific posts For You, they identify posts they think you will like, and then demote all other content?
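The promote/demote equivalence above is easy to check with a toy ranking sketch (the posts and scores here are invented for illustration; this is not TikTok's actual system): adding a bonus to a "For You" subset produces exactly the same feed order as subtracting the same penalty from everything outside it.

```python
# Toy feed ranking: four posts with hypothetical base relevance scores.
posts = {"a": 0.9, "b": 0.7, "c": 0.5, "d": 0.3}
for_you = {"b", "d"}  # the subset the algorithm thinks you'll like

# "Promote": boost the For You subset by a fixed amount.
promoted = {p: (s + 1.0 if p in for_you else s) for p, s in posts.items()}

# "Demote": penalize everything *outside* the subset by the same amount.
demoted = {p: (s if p in for_you else s - 1.0) for p, s in posts.items()}

def rank(scores):
    """Return post IDs sorted by score, highest first."""
    return sorted(scores, key=scores.get, reverse=True)

# Both strategies yield the identical feed order.
assert rank(promoted) == rank(demoted) == ["b", "d", "a", "c"]
```

Since only relative order determines what the user sees, the two framings are indistinguishable from the outside, which is the commenter's point.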
 
Okay, help me out here, because I'm retarded and this thought just came into my head. There are vids on YouTube or TikTok that start with disclaimers; are those disclaimers now voided because of this, or...?
 
Is this good or bad?
Depends on if you're gaming the system.
Rip Al-g-rhythm shot in street due TikTok brain rot by Chinese.
born 2021- died 2024
...
Side note: well, as long as this stays civil, Pornhub, xHamster, and OnlyFans might get roped in if this issue stays about algorithms (radicalizing rape culture). *sigh*
 
Can I sue youtube shorts any time it shows me a tranny?
If you can find a negligent way it caused you harm, you might be able to.
There are vids on youtube or tiktok that start with disclaimers, are those disclaimers now voided because of this or
What kind of disclaimers?
 
Big tech companies (especially tiktok) and their algorithms are so hated that they can be used as a handy excuse to destroy free speech on the internet as a whole.
Even beyond that, outsourcing moderation to algorithms is a horrible, horrible idea that every platform nevertheless adopted

And now we get to clean up after them
 
Modern American parenting has been a long term global disaster.
Not calling you out specifically but this is just American exceptionalism thinking.

America didn't infect the world with iPhone/iPad kids, you were followers. In Asian countries like Korea/China/the Philippines, it's been the norm for parents to let PCs/phones raise their kids since net cafes were invented in the '90s. Then it went to the first smartphones. Then iPads. America has nothing on the level of weaponized dissociative kid raising that Asia has been doing for several decades.
 
Not calling you out specifically but this is just American exceptionalism thinking.

America didn't infect the world with iPhone/iPad kids, you were followers. In Asian countries like Korea/China/the Philippines, it's been the norm for parents to let PCs/phones raise their kids since net cafes were invented in the '90s. Then it went to the first smartphones. Then iPads. America has nothing on the level of weaponized dissociative kid raising that Asia has been doing for several decades.
For example, the average Japanese twitter user is orders of magnitude more unhinged than the craziest American user
 
Can I sue youtube shorts any time it shows me a tranny?
I have never seen a law ever used fairly. Even obvious traffic laws are enforced based on what the driver is driving, or on whether the cop just doesn't feel like chasing someone down over an expired sticker.

Whatever the case is, laws like this are not meant to be, but will be, used as another arrow in the quiver of the government to “get” people they don’t like. Who will they target first? Most likely Musk and Twitter, just knowing them. With Zuckerberg wayyyy down at the bottom of the list, if he is there at all.
 
Exactly. I'm trying to figure out why so many people here seem to think this is a win or something.

How can you have any algorithm if you are liable for the content within? The home page is an algorithm, the search results are an algorithm, ratings sort is an algorithm, autoplay is an algorithm... Do you expect that Google will shutter and gimp half their products? Do you people think TikTok actually boosted 'blackout' content, or just failed to censor it? How is this anything but a ruling that undermines 230 in favor of stringent censorship/content vetting?
Seems like there are plenty of retards here who think the government is not only there to protect all the children, but has an obligation to do so at all costs.

In any case, the usability of the Internet as a whole is about to take a huge hit. I recommend being prepared by following the advice from these 2 threads:
1) https://kiwifarms.st/threads/internet-blackout-contingency-plans.199430/
2) https://kiwifarms.st/threads/offline-long-term-digital-archival.195002/
 
Exactly. I'm trying to figure out why so many people here seem to think this is a win or something.

How can you have any algorithm if you are liable for the content within? The home page is an algorithm, the search results are an algorithm, ratings sort is an algorithm, autoplay is an algorithm... Do you expect that Google will shutter and gimp half their products? Do you people think TikTok actually boosted 'blackout' content, or just failed to censor it? How is this anything but a ruling that undermines 230 in favor of stringent censorship/content vetting?
Any business carries a large amount of liability risk. Just because something has a risk of creating liability doesn't mean it can't be profitable. There are also ways outside of Section 230 that liability can be reduced. As users, we are liable for our own content, and yet there is still more obscene material on the Internet than anyone could consume in his lifetime.
 
What this will probably mean for us is that Null will have to get rid of the highlighted-post feature, since it is algorithmically driven in the loosest sense, to avoid any liabilities. Featured content on the front page might have to be done away with too, since promotion of content is also editorialization, which is technically unprotected. That won't be too bad though, because the Community Driven Happening Thread is still fair game.
 