Opinion: Deplatforming hate forums doesn't work, British boffins warn - Industry intervention alone can't deal with harassment


Depriving online hate groups of network services - otherwise known as deplatforming - doesn't work very well, according to boffins based in the United Kingdom.

In a recently released preprint paper, Anh Vu, Alice Hutchings, and Ross Anderson, from the University of Cambridge and the University of Edinburgh, examine efforts to disrupt harassment forum Kiwi Farms and find that community and industry interventions have been largely ineffective.

Their study, undertaken as lawmakers around the world consider policies that aspire to moderate unlawful or undesirable online behavior, finds that deplatforming has only a modest impact and that those running harmful sites remain free to carry on harassing people through other services.

"Deplatforming users may reduce activity and toxicity levels of relevant actors on Twitter and Reddit, limit the spread of conspiratorial disinformation on Facebook, and minimize disinformation and extreme speech on YouTube," they write in their paper. "But deplatforming has often made hate groups and individuals even more extreme, toxic and radicalized."

As examples, they cite how Reddit's ban of r/incels in November 2017 led to the creation of two incel domains, which then grew rapidly. They also point to how users banned from Twitter and Reddit "exhibit higher levels of toxicity when migrating to Gab," among other similar situations.

The researchers focus on the deplatforming of Kiwi Farms, an online forum where users participate in efforts to harass prominent online figures. One such person was a Canadian transgender streamer known as @Keffals on Twitter and Twitch.

In early August last year, a Kiwi Farms forum member allegedly sent a malicious warning to police in London, Ontario, claiming that @Keffals had committed murder and was planning further violence, which resulted in her being "swatted" - a form of attack that has proved lethal in some cases.

Following further doxxing, threats, and harassment, @Keffals organized a successful campaign to pressure Cloudflare to stop providing Kiwi Farms with reverse proxy security protection, which helped the forum defend against denial-of-service attacks.

The research paper outlines the various interventions taken by internet companies against Kiwi Farms. After Cloudflare dropped Kiwi Farms on September 3 last year, DDoS-Guard did so two days later. The following day, the Internet Archive and hCaptcha severed ties.

On September 10, the kiwifarms.is domain stopped working. Five days later, security firm DiamWall suspended service for those operating the site.

On September 18, all the domains used by the forum became inaccessible, possibly related to an alleged data breach. But then, as the researchers observe, the Kiwi Farms dark web forum was back by September 29. There were further intermittent outages on October 9 and October 22, but since then Kiwi Farms has been active, apart from brief service interruptions.

"The disruption was more effective than previous DDoS attacks on the forum, as observed from our datasets. Yet the impact, although considerable, was short-lived," the researchers state.

"While part of the activity was shifted to Telegram, half of the core members returned quickly after the forum recovered. And while most casual users were shaken off, others turned up to replace them. Cutting forum activity and users by half might be a success if the goal of the campaign is just to hurt the forum, but if the objective was to 'drop the forum,' it has failed."

Hate is difficult to shift

One reason for the durability of such sites, the authors suggest, is that activists get bored and move on, while trolls are motivated to endure and survive. They argue that deplatforming doesn't look like a long-term solution because, while casual harassment forum participants may scatter, core members become more determined and can recruit replacements through the publicity arising from censorship.

Vu, Hutchings, and Anderson argue that deplatforming by itself is insufficient and needs to be done in the context of a legal regime that can enforce compliance. Unfortunately, they note, this framework doesn't currently exist.

"We believe the harms and threats associated with online hate communities may justify action despite the right to free speech," the authors conclude. "But within the framework of the EU and the Council of Europe which is based on the European Convention on Human Rights, such action will have to be justified as proportionate, necessary and in accordance with the law."

They also contend that police work needs to be paired with social work, specifically education and psycho-social support, to deprogram hate among participants in such forums.

"There are multiple research programs and field experiments on effective ways to detox young men from misogynistic attitudes, whether in youth clubs and other small groups, at the scale of schools, or even by gamifying the identification of propaganda that promotes hate," they argue. "But most countries still lack a unifying strategy for violence reduction." ®
 
I think we all need to step back and realize this isn’t a government gameplan or anything; at best it’s a thesis from three uninformed, over-educated hacks that, like that one paper on impregnating brain-dead women, is more of a shock-value “We can do this, but know no one will.”

Some of you are getting MATI over the Academia equivalent of Attention Whoring.
I mean, you are right that this is just a bunch of intelligentsia being retarded (Lenin was right about them NGL), but that doesn't mean this might not quickly become more malicious. It gets published, starts circulating around retarded social media circles, and then breaks into the mainstream via a couple more articles like this. Next thing you know, some politicians are going to be pushing for this.

"Mainstream Academia" is always ahead of the political positions being pushed openly by TPTB. Give it time, and it will likely shoot far and away out of control.
 
We, no, ANYONE could have told you that! When you target people for their opinions, the very BEST case is that they get more entrenched and angry. When you try to silence people, you run the very real risk of them beginning to see speech as irrelevant. On a large scale, especially one including nearly half the population, attempts to silence are an invitation to much MUCH worse than people simply saying mean things online! This shit is not just dangerous in the sense that it's a violation of free speech and association, but also in the sense that it (rightfully) makes people take an even harder line and feel victimized. This insane and fallacious theory from social media that "de-platforming works" needs to be dispelled ASAP. Sadly it's become a major talking point with reddit types. Even a hint of pushback against the idea will get you a storm, a tantrum, of REEEEEing and doubling down on its usefulness.
 
Lol. Perhaps even lmao. I might even offer lmfao.

This entire thing reads like a bad, late April Fool’s joke from someone who finally got around to watching A Clockwork Orange.

While I am certain there are people who post on this forum possessing some level of real vitriol, I would imagine most participate to voice their opinions in an avenue where they are least likely to be condemned for it. I’ve met several great people during my time as a registered user here. Nothing too personal, but I can tell that these people separate their KF user identities from their actual person, whereas people such as this author are terminally online and insufferable regardless of the situation or space.

This forum gets a really negative reputation for the stupidest things, like the use of the gamer word, or for exposing people like keffals for being a sex pest. It is an absurdist mentality to think that someone requires a lobotomy because they think differently than you, or don’t like the people that you champion. Don’t ever meet your heroes. Better yet, stop breathing entirely. You share the same oxygen as those pesky Kiwi Farms posters. This site is both noxious and infectious, you might turn into one of us otherwise.

I will continue to call creatures such as journalists and Keffals faggots. I will continue to comment on their stupidity. The only “reprogramming” of me that will happen is when I tire of the site and want to take a few months away from it, but rest assured that I will be back and more ready than ever to call people faggots.

Faggot
 
Calling the Tor version of the site a "dark web forum" is so fucking cringe. Actually, the term "dark web" is just the faggiest shit. It's funny how they can just openly say "We want felonies committed against this website because they say mean words" and it's just fine. I shouldn't be surprised at this point at how disingenuous some Br*t*sh journoscum is but they really keep breaking the glass ceiling when it comes to blatantly lying.
 
It’s British tabloid-speak for “scientist”. I’ve literally never seen it used outside of The Sun newspaper before.

See also: tot (a small child).
The Toys for Tots charity makes it somewhat famous in the US, but nobody actually calls children "tots" outside of that, or a few classic movies.
 
Here's a copy of the paper. I skimmed it and it's about what you'd think.

This type of midwit take makes my blood boil:
[attachment: screenshot of the take in question]
I like how these people say this and then complain when Elon Musk doesn't do what they want on Twitter. They use the libertarian "it's just private companies, bro" argument only when it's a group they dislike. When it's a group they like that gets deplatformed, suddenly they need the government to get involved to regulate Twitter.
 