Nazis and incels are using Gotye and MGMT to evade TikTok's auto-moderators, report finds

The largest independent study of hate on TikTok has found anti-Asian and pro-Nazi videos are racking up millions of views, often using pop songs to evade the platform's auto-moderators.

Key points:

  • The TikTok "Hatescape" report was a result of three months of research
  • Researchers found that simple measures, like misspelling a word, were enough to hide a video from moderators
  • Many evasion techniques were built into the app
A report by the Institute of Strategic Dialogue (ISD), a London-based counter-extremism think tank, said the leading social platform was "popular with those intent on using online spaces to produce, post and promote hate and extremism".

The "Hatescape" report, the result of three months of research, found some of the platform's leading racist or terroristic content was driven by, or pertained to, Australians.

One video that garnered over 2 million views involved a Caucasian man eating a bat in an offensive reference to stereotypes about Chinese people.

Other videos featuring Australians involved one man dressed in blackface or as a police officer while "reenacting" the murder of George Floyd, whose death sparked last year's global Black Lives Matter protests.

"This kind of stuff is repeated across the board," the study's author Ciaran O'Connor said.

"The comments below then kind of confirm what themes it is hitting on and how it's generating hateful attitudes."

While it's hardly surprising fringe groups have used social media platforms to post hateful material, experts say TikTok has birthed creative ways to distribute the content and dodge auto-takedowns.

These evasions have included something as simple as changing the soundtrack — but it remains unclear what exactly works.

"TikTok is notoriously quite a difficult platform to study. For one thing, it's kind of new and there isn't much of a methodology," said Ariel Bogle, an analyst at the International Cyber Policy Centre (ICPC).

"It's also driven by an algorithm, which has remained very opaque."


Mr O'Connor said in his sample of 1,030 videos — compiled from a library of common hashtags and phrases deployed by fringe groups — there was egregious content about all manner of protected groups.

Anti-Asian videos often made prevalent use of COVID-19 or other unrelated hashtags, which Mr O'Connor said made the content more discoverable and could get it in front of people who would otherwise never see it.

Almost half of the videos in the sample were pro-Nazi materials, some showing footage from the Christchurch shooter's livestream or people reading directly from his manifesto.

This content is illegal in many jurisdictions, and some posts may contain a link directing viewers to an encrypted messaging channel with more extremist material.

TikTok was also found to be attracting other fringe types, such as anti-transgender groups and the Men Going Their Own Way (MGTOW) ideology, an offshoot of the misogynistic incel movement.

Mr O'Connor said more than 80 per cent of the videos were still live at the conclusion of the study, but the ABC understands the platform has since removed all of them after they were flagged by ISD.

While TikTok said it removes the vast majority of content that violates its policies within 24 hours, many of the techniques used to evade its moderators rely on built-in features of the platform.

Mr O'Connor said the platform has learnt from the mistakes of other social media companies, but more transparency was needed. (AP)
Both Ms Bogle and ISD pointed to simple ways users have sidestepped moderation, such as changing a letter in a banned phrase or account name, or misspelling a hashtag.

Other techniques made use of the app's "Stitch" and "Duet" features to combine banned or Nazi material with other videos, which in some cases appeared to be enough to avoid censorship.

"There's always a kind of cat and mouse game going on with TikTok's moderation," Ms Bogle said.

"I've also seen figures who have their accounts removed by TikTok but have remained on the platform because other people have just re-uploaded them, potentially with a different soundtrack."

TikTok can also forcibly mute videos whose audio infringes its community guidelines, but an effective workaround has been using songs from the app's catalogue.

The song of choice for fringe groups, according to Mr O'Connor, was MGMT's Little Dark Age.

Also featured prominently in ISD's sample were Gotye's Somebody That I Used to Know and Kate Bush's Running Up That Hill.

None of these songs contain any extremist connotations.

Mr O'Connor said that while TikTok has learned from the pitfalls and mistakes of other social media companies, and most of the flagged videos had low engagement, more transparency was needed from the company on how it moderates its content.

"There's an enforcement gap in TikTok's approach to hate and extremism … this content is removed, but inconsistently," he said.

A spokeswoman for TikTok said the social media company "greatly" valued research produced by ISD and other organisations.

"TikTok categorically prohibits violent extremism and hateful behaviour, and our dedicated team will remove any such content as it violates our policies and undermines the creative and joyful experience people expect on our platform," she said in a statement.

"We greatly value our collaboration with ISD and others whose critical research on industry-wide challenges helps strengthen how we enforce our policies to keep our platform safe and welcoming."
Posted 8 hours ago, updated 3 hours ago
 
"The "Hatescape" report, the result of three months of research, found some of its leading racist or terroristic content was driven by or pertained to Australians.
One video that garnered over 2 million views involved a Caucasian man eating a bat in an offensive reference to stereotypes about Chinese people."

@Dyn You have some 'splainin to do.
 
Honestly, I dunno why it's that offensive to say that Chinks eat bats and dogs, cause they do. Is it racist to say that French people eat horses? Cause they do. Hell, some of my countrymen eat dogs too, mostly cause they're too poor to buy real meat so they just catch stray dogs. It's fucked up, but it's cause people are poor so I can't fault them for trying to survive.

You can argue that saying that ALL Chinks eat dogs is racist, cause that's not true, but to say that some of them do is just a fact.
 
China is a funny country. The people are so segregated that they don't even know that people in the slums and countryside eat dogs. Of course, whats even more funny is western people that assume that all of China is just Hong Kong.
 
If they don’t let you know what the ‘common phrases and hashtags’ they used are, chances are this is a loaded study. For instance I wonder if they searched for ‘riots’, ‘Antifa’, ‘George Floyd’, ‘BLM’, ‘Portland’, or if they searched for ‘supremacy’, ‘nazi’ ‘white’ ‘bat’ ‘MAGA’ and ‘Trump’.
 
China unleashes a plague killing countless people... but please don't make fun of them by trying to blind yourself with dental floss, because that's just rude.

If WW3 happens, allied powers are going to be like "Um, yeah. Watching all those Chinese die in nuclear fire is cool, but don't call them chinks on social media."
 

Seriously, fuck these people.
 