The Rise, Fall, and Return of Kiwi Farms After Deplatforming Efforts - Pseudo intellectual article attempts to frame the farms deplatforming as authoritatively retarded as possible

Authors:
(1) Anh V. Vu, University of Cambridge, Cambridge Cybercrime Centre (anh.vu@cl.cam.ac.uk);
(2) Alice Hutchings, University of Cambridge, Cambridge Cybercrime Centre (alice.hutchings@cl.cam.ac.uk);
(3) Ross Anderson, University of Cambridge, and University of Edinburgh (ross.anderson@cl.cam.ac.uk).

2.1. Related Work

Most studies assessing the impact of deplatforming have worked with data on social networks. Deplatforming users may reduce activity and toxicity levels of relevant actors on Twitter [28] and Reddit [29], [30], limit the spread of conspiratorial disinformation on Facebook [31], reduce the engagement of peripheral members with hateful content [44], and minimise disinformation and extreme speech on YouTube [32]. But deplatforming has often made hate groups and individuals even more extreme, toxic and radicalised. They may view the disruption of their platform as an attack on their shared beliefs and values, and move to even more toxic places to continue spreading their message. There are many examples: the Reddit ban of r/incels in November 2017 led to the emergence of two standalone forums, incels.is and incels.net, which then grew rapidly; users banned from Twitter and Reddit exhibit higher levels of toxicity when migrating to Gab [33]; users who migrated to their own standalone websites after being banned from r/The_Donald expressed higher levels of toxicity and radicalisation, even though their posting activity on the new platform decreased [45], [46]; the ‘Great Deplatforming’ directed users to other less regulated, more extreme platforms [47]; the activity of many right-wing users who moved to Telegram after being banned on major social media increased multi-fold [34]; users banned from Twitter are more active on Gettr [48]; communities that migrated from Reddit to Voat can be more resilient [49]; and roughly half of QAnon users moved to Poal after the Voat shutdown [50]. Blocking can also be ineffective for technical and implementation reasons: removing Facebook content after a delay appears to have been ineffective and to have had limited impact due to the short cycle of users’ engagement [51].

The major limitation of focusing on social networks is that these platforms are often under the control of a single tech company, so content can be permanently removed without effective backup and recovery. We instead examine the deplatforming of a standalone website, which involved a concerted effort on a much wider scale by a series of tech companies, including some big entities that handle a large amount of Internet traffic. Such standalone communities, for instance websites and forums, may be more resilient as the admin has control of all the content, facilitating easy backups and restores. While existing studies measure changes in posting activity and the behaviours of actors when their place is disrupted, we also provide insights about other stakeholders, such as the forum operators, the community leading the campaign, and the tech firms that attempted the takedown.

Previous work has documented the impacts of law enforcement and industry interventions on online cybercrime marketplaces [20], cryptocurrency market prices [52], DDoS-for-hire services [14], [15], the Kelihos, Zeus, and Nitol botnets [53], and the well-known click-fraud network ZeroAccess [54]; yet how effective a concerted effort by several tech firms can be in deplatforming an extreme and radicalised community remains unstudied.

2.2. The Kiwi Farms Disruption

KIWI FARMS had been growing steadily over a decade (see Figure 1) and had been under Cloudflare’s DDoS protection for some years.[2] Forum activity increased by roughly 50% during the COVID-19 lockdowns starting in March 2020, presumably as people were spending more time online. Prior interventions resulted in the forum being banned from Google AdSense, and from Mastercard, Visa and PayPal in 2016; from hundreds of VPS providers between 2014 and 2019 [55]; and from selling merchandise on the print-on-demand marketplace Redbubble in 2016. XenForo, a closed-source forum platform, revoked its licence in late 2021 [56]. DreamHost stopped its domain registration in July 2021 after a software developer who had been harassed by the site’s users killed himself. This did not disrupt the forum, as it was given 14 days to seek another registrar [57]. While these interventions may have had negative effects on its profit and loss account, they did not impact its activity overall. The only significant disruption in the forum’s history was between 22 January and 9 February 2017 (19 days), when the forum’s owner suspended it himself after his family was harassed [58].[3]

The disruption studied in this work was started by the online community in 2022. A malicious report was sent to the police in London, Ontario by a forum member on 5 August 2022, claiming that a Canadian trans activist had committed murders and was planning more, which led to her being swatted [23]. She and her family were then repeatedly tracked, doxxed, threatened, and generally harassed. In response, she launched a campaign on Twitter on 22 August 2022 under the hashtag #dropkiwifarms and planned a protest outside Cloudflare’s headquarters to pressure the company to deplatform the site [59]. The campaign generated a great deal of attention and mainstream headlines, which ultimately resulted in several tech firms trying to shut down the forum. This was the first time the forum had been completely inaccessible for an extended period due to external action, with no activity anywhere online, including the dark web. It attempted to recover twice, but even when it eventually returned online, its overall activity was roughly halved.

The majority of actions taken to disrupt the forum occurred within the first two months of the campaign. Most were widely covered in the media and can be checked against public statements made by the industry and the forum admins’ announcements (see Figure 2). The forum came under a large DDoS attack on 23 August 2022, one day after the campaign started. It was then unavailable from 27 to 28 August 2022 due to ISP blackholing. Cloudflare terminated its DDoS prevention service on 3 September 2022 – just 12 days after the Twitter campaign started – due to an “unprecedented emergency and immediate threat to human life” [24]. The forum was still supported by DDoS-Guard (a Russian competitor to Cloudflare), but that firm also suspended service on 5 September 2022 [25]. The forum remained active on the dark web, but this .onion site soon became inaccessible too. On 6 September 2022, hCaptcha dropped support; the forum was removed from the Internet Archive on the same day [60]. This left it under DiamWall’s DDoS protection and hosted on VanwaTech – a hosting provider describing itself as neutral and non-censoring [61]. On 15 September 2022, DiamWall terminated their protection [26] and the ‘.top’ domain provider also stopped support [27]. The forum was completely down from 19 to 26 September 2022 and from 23 to 29 October 2022. From 23 October 2022 onwards, several ISPs intermittently rejected BGP announcements or blackholed routes to the forum, citing violations of their acceptable use policies; these included Voxility and Tier-1 providers such as Lumen, Arelion, GTT and Zayo. This is remarkable, as there are only about 15 Tier-1 ISPs in the world. The forum admin devoted extensive effort to maintaining the infrastructure, fixing bugs, and providing guidance to users in response to password breaches. Eventually, by routing through other ISPs, KIWI FARMS was able to get back online on the clearnet and remain stable, particularly following its second recovery in October 2022.
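
To make the infrastructure side of this concrete: the outage windows above are the kind of thing that can be reconstructed simply by polling the site's reachability on both the clearnet and its .onion mirror. The sketch below is a minimal illustration under stated assumptions, not the authors' measurement pipeline; the hostnames are placeholders, and the dark-web check assumes a local Tor client exposing a SOCKS proxy on 127.0.0.1:9050, with the requests[socks] extra installed.

# Minimal reachability probe (an assumption about how such uptime data could be
# gathered, not the paper's actual pipeline). Hostnames are placeholders.
import datetime
import requests

CLEARNET_URL = "https://forum.example.net/"    # placeholder, not the real domain
ONION_URL = "http://examplexyz.onion/"         # placeholder, not the real hidden service
TOR_PROXY = {"http": "socks5h://127.0.0.1:9050",
             "https": "socks5h://127.0.0.1:9050"}

def probe(url, proxies=None):
    """Return a coarse up/down label for a single probe of `url`."""
    try:
        r = requests.get(url, timeout=30, proxies=proxies)
        return f"up (HTTP {r.status_code})"
    except requests.exceptions.RequestException as exc:
        # DNS failures, refused/reset connections, and timeouts all read as
        # "down" from the outside; keep the exception class for later triage.
        return f"down ({exc.__class__.__name__})"

if __name__ == "__main__":
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    print(stamp, "clearnet:", probe(CLEARNET_URL))
    print(stamp, "onion:   ", probe(ONION_URL, proxies=TOR_PROXY))

Run on a schedule and logged, probes like these would reproduce the downtime windows reported above (e.g. 19–26 September and 23–29 October 2022) without any cooperation from the infrastructure providers involved.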

(Link/Archive)

----------------------------------------------------------------------------------------------------

Formatting was fucked from the website so I'm not transferring any of the images. If this has already been posted, then help me figure out why I can't find it on the site.

Edit: Sorry if it wasn't clear. This is not the entire article. I just posted the part where they talk the most about the farms' disruption. Check out the archive link to see the full autistic hate of the farms on display!
 
The moral of the study is that you need to kill or arrest website administrators. The entire thesis is that industry censorship doesn't work and you need to go after the minds behind it. This is what they're laying groundwork for.
This really isn't a thesis: in countries which do not have Section 230, this is what actually happens, and it has been documented for ages. They may present it as a thesis simply because they don't know what they're talking about and don't understand the entire landscape of the internet.

As a website owner or Facebook group owner in other countries without Section 230, you are very often legally responsible for what your users write. What the Cambridge researchers do not realize is that the "rest of the world" that isn't America (the other 95%) is the reason that things like Bitcoin, Tor, BitTorrent and many other censorship-resistant and anonymity tools exist in the first place.

You're correct that they're probably trying to lay the groundwork for that to happen legally within the US, but when it comes to the technological race they already lost it: if not when Tor and other tools were launched, then certainly when the US government removed Tornado Cash from the OFAC list: https://home.treasury.gov/news/press-releases/sb0057. That removal means they realized they cannot sanction a piece of code running on a decentralized network, and its implications go towards the DPRK being able to finance building nuclear arms, not towards a forum that was built upon the basis of documenting Christine Westside Chandler.

People have tried and failed to stop Tor, Bitcoin and BitTorrent, and none of them have been stopped at this point. The technological arms race has already established all of those permanently in our lives, and there are many other similar projects progressing slowly with or without the input of people from Cambridge who do not know what they're talking about. The arms race they are trying to act against started decades ago, long before Chris Chan was even born.
 
DreamHost stopped its domain registration in July 2021 after a software developer who had been harassed by the site’s users killed himself.
Ah yes, the good ol' "use a person's death to preach my view of the world even though the two are uncorrelated and can be easily fact-checked".

the ‘Great Deplatforming’ directed users to other less regulated, more extreme platforms [47]; the activity of many right-wing users who moved to Telegram after being banned on major social media increased multi-fold [34]; users banned from Twitter are more active on Gettr [48]; communities that migrated from Reddit to Voat can be more resilient [49]; and roughly half of QAnon users moved to Poal after the Voat shutdown [50]. Blocking can also be ineffective for technical and implementation reasons: removing Facebook content after a delay appears to have been ineffective and to have had limited impact due to the short cycle of users’ engagement [51].
I absolutely love the wording here, because it's clearly saying that deplatforming is good and should be made even more effective, just because people want to express their opinions online.
People wonder why there's censorship and why there's an underground that causes it; it's because these freaks are pushing their ideals onto others.
 
Lefty papers harping about disinformation while openly lying about the sites they discredit is one hell of a statement.
They are literally putting their delusions into scientific papers without quoting a single fucking source.

"This is peer-reviewed, so it must be true!"
I seem to remember that a Foodist Irish kid swatted Keffals, not the Kiwis. Didn't Jesse Singal do a podcast on it? The academics need to get their facts straight and quit using vague terms such as "harass", "bully", "silence", etc. Chucking in a quantified graph does not turn an opinion piece into a study.
 
Ross Anderson, University of Cambridge, and University of Edinburgh
As another user mentioned, Ross is dead. If this is really him, he has a Wikipedia article.


Additionally, here's a voice clip of him talking about malware:
 
Interesting time for this to come out.

The hugely unpopular UK Online Safety Act is taking heat, even among troons cos muh pedo rape games ban!
And Keffals, who this bullshit case *still* hinges on, is totally, on all fronts, recognised as a grifter and liar.
Terrible time to release this absolutely wacky, uninsightful, error-ridden shitpiece of academic work.
Low and a bit embarrassing for Cambridge uni to be honest.
 
"We should arrest and kill people who disagree with us if we can't silence them!"

Academics claim to be smarter than the rest of us, yet they open that door without realizing how quickly it can turn on them.

Don't expect sympathy from me when someone more unhinged than I reads this kind of shit and takes it to its natural conclusion in turn. You haughty midwit faggots earned it.
 
[Three attached images]
I have seen that woman somewhere before. I am not absolutely sure where from, but I think she is one of those WEF "stakeholder", censor-you-for-your-own-good types. I also grabbed the .pdf if anyone wants to read this shit in a better format.
Maybe from the eWhore video on Youtube? That woman will insert herself into any edrama and moral panic she can. The United Kingdom needs to be bombed. It's a failed country filled with failed people.
The academics need to get their facts straight
:story:Universities don't care if you cite Wikipedia today. Asking them for the facts is pointless. They should have their grants taken from them until they go back to being unbiased.
 