UK Race riots put Britain on collision course with Elon Musk - Britain’s government has social platforms in its sights as incitement spreads — and the X owner is squaring up for a fight.

Race riots put Britain on collision course with Elon Musk
Politico EU (archive.ph)
By Esther Webber and Vincent Manancourt
2024-08-06 07:56:44GMT

Fake news channels on X helped to disseminate false information about the killing of three children in Southport last week. | Christopher Furlong/Getty Images

LONDON — Britain’s in the grip of its worst race riots in decades. And Elon Musk just can't help himself.

The billionaire X owner sparked fury in the British government this weekend after he responded to incendiary footage of the far-right disorder that's sweeping the country by saying "civil war is inevitable."

The post on X was roundly condemned by U.K. Prime Minister Keir Starmer's office, which said there was “no justification” for Musk’s comments.

But Musk doubled down on Monday night. Responding to a statement from Starmer vowing his government would “not tolerate attacks on mosques or on Muslim communities,” the X boss effectively accused the British prime minister of wearing blinkers. “Shouldn’t you be concerned about attacks on all communities?”

Starmer's top interior minister, Yvette Cooper, meanwhile has a litany of complaints over the way social media giants like X are policing incitement and disinformation on their platforms.

“There are some things which are clearly already criminal, where we'll need police intervention and action to pursue those," Cooper told the BBC Monday. "There are other areas where the social media companies do have clear requirements at the moment to remove criminal material, and should be doing so, but sometimes take too long to do so.

"There are other areas where they have made commitments around their terms and conditions that are supposed to be enforced but are not being done so."

Cooper's vowed to take up the issue with tech giants this week.

Yet, despite plenty of hand-wringing over the proliferation of far-right messaging, Britain's toolbox for forcing the hands of social media companies seems limited.

This time, the riots — which have seen mosques attacked and accommodation for asylum seekers targeted — were inextricably linked to online communications. Fake news channels on X helped to disseminate false information about the killing of three children in Southport last week.

The riots — which have seen mosques attacked and accommodation for asylum seekers targeted — were inextricably linked to online communications. | Christopher Furlong/Getty Images

Right-wing influencers with huge reach, such as English Defence League founder Tommy Robinson and actor-turned-anti-woke activist Laurence Fox, have punted messages at their thousands of followers on X, Facebook, Instagram and TikTok. (Fox approvingly shared Musk's attack on Starmer Monday night.)

WhatsApp and Telegram have been used to organize gatherings at short notice, while flyers organizing specific protests have been spread on Facebook. TikTok has been abuzz with videos of the violence.

But X in particular has proven a hotbed of far-right chatter. Musk's direct intervention aside, the platform has also reinstated Robinson's account. He remains banned from Instagram and Facebook.

In a statement Monday, Britain’s Tech Secretary Peter Kyle said it is “undeniable” that social media has provided a platform for the rioters.

“We have been clear with these companies they also have a responsibility not to peddle the harm of those who seek to damage and divide our society, and we are working closely with them to ensure they meet that responsibility,” he added.

'No need to wait'
So, beyond beefing with Musk, what can Britain’s government actually do? The administration has a big legislative stick to use — but it's simply not ready yet.

Under Britain’s Online Safety Act, years in the making, platforms will have a duty to “take robust action” against illegal content. That includes content that incites violence or which is related to “racially or religiously aggravated public order offenses.”

Platforms are meant to prevent illegal content from appearing in the first place — and to act quickly to remove it if it does.

Failing to meet these obligations could see social media firms hit with fines from media regulator Ofcom of up to £18 million — or 10 percent of their worldwide revenue, whichever is greater.

But crucially, the act's provisions on illegal content only come into effect around the end of 2024. And Britain’s existing laws on inciting violence stem from the Public Order Act 1986, which predates social media by decades — and so require police to comb platforms for potential breaches.

For now, British authorities can only implore tech companies to do the right thing and stringently enforce their own policies, many of which claim to ban the kind of content that has been openly rife online in recent days.

“There’s no need for online services to wait for the new laws to come into force before they make their sites and apps safer for users,” said a spokesperson for Britain's tech regulator, Ofcom.

“Our role will be to make sure that regulated services take appropriate steps to protect their users,” they added. “It will not involve us making decisions about individual posts or accounts.”

Sunder Katwala, director of the think tank British Future, told POLITICO that “will and capacity” are needed by social media platforms to remove offensive or dangerous content, “and what you've got at the moment is less will and less capacity than you used to have, certainly in the case of X — and on Facebook and TikTok.” X, Meta, TikTok, and Telegram were approached for comment.

Social media could have upsides in catching those breaking the law. | Christopher Furlong/Getty Images

He added that pressure from the top could be key to forcing change, since “politicians have actually got something very important on the regulators — which is that they've got a forum to which you can summon people.”

Sara Khan, who served as former Prime Minister Rishi Sunak’s adviser on social cohesion, has accused ministers of failing to heed her 2021 report co-authored with Metropolitan Police chief Mark Rowley, which warned that certain prevalent forms of hateful extremism are not captured by existing legislation.

“Our rules have failed to evolve with this growing extremist threat, there are gaps in our legislation that is allowing them to, in effect, operate with impunity," Khan told the Guardian this week.

Over in the EU, the bloc's equivalent of the Online Safety Act — the Digital Services Act — is already in force and X is facing a probe by the European Commission over the spread of toxic content on the platform.

In France, President Emmanuel Macron even floated the idea of cutting access to social media platforms altogether because of the role he said they played in exacerbating riots in the country last summer. Britain seems unlikely to go quite that far.

Action by social media giants ultimately depends on the credible threat of enforcing regulation, according to Katwala — something he believes has been sorely lacking so far. "If tech companies don’t comply when the time comes, we’ll have a broad range of enforcement powers at our disposal," said the Ofcom spokesperson.

In the meantime, social media could have upsides in catching those breaking the law. Nazir Afzal, who was chief crown prosecutor in the north west of England at the time of the 2011 disorder, pointed out that videos shared online would make it far easier to identify perpetrators than it was 13 years ago, when the main resource available was CCTV.

But, as the sparring with Musk continues, Britain's government remains to be convinced. “Some of this is about criminal behavior of individuals, and some of this is about the responsibility of the social media companies,” said Cooper, the home secretary. “We need to pursue both, because we obviously cannot carry on like this.”
 
“Our role will be to make sure that regulated services take appropriate steps to protect their users,” they added
Protect users from what? Children were stabbed in their torsos, not their Twitter accounts, and people are screaming at migrants in the street, not online.

Over in the EU, the bloc's equivalent of the Online Safety Act — the Digital Services Act — is already in force and X is facing a probe by the European Commission over the spread of toxic content on the platform.
"Toxic" is a muhmentalhealth term.
 
From the British government's point of view, the only way to deal with the problem is authoritarian-state-style complete blackouts of social media platforms. But the political cost of that is very high, since Western hypocrisy on topics such as democracy and freedom of speech would then be plainly evident, and countries like China, Russia and India could simply point at it.

The general trend that diverse countries can only be ruled authoritarian still holds true. Europe is moving heavily towards police state and censorship because that's the only way to keep the diverse societies from completely imploding.
 
Fake news channels on X helped to disseminate false information about the killing of three children in Southport last week.

I know, Nothing Ever Happens, but journalists are going to regret their traitorous existence one day.

The only information that matters, the info that has the native population shit-kicking right now, is that immigrants (likely illegal/extrajudicial) from war-torn shitholes are killing and raping their women and children, and have been for a long time.

Fake news, false information, the fucking projection of these assholes. Even the pic that accompanies that caption looks like a gay-opped photo. Everyone in the background facing the same direction, with their faces visible, and then one asshole in a skull mask facing the camera like he was instructed to, waving a flare and displaying a knife (ooh, so scary) for no reason other than caricaturing a 'bad-guy'.
 
Gee, maybe if the government and the media hadn't concealed the identity of the (non-Muslim) perpetrator because he wasn't QUITE yet 18 (but then released it when it suited their narrative because he was ALMOST 18 a few days later) then perhaps this could have been avoided?

But also perhaps there's a reason that Britons immediately assume that any horrific crime of this sort is due to Muslims? I wonder, why could that be?

Rule Britannia, indeed.
 