
Elon Musk says ‘civil war is inevitable’ as UK rocked by far-right riots. He’s part of the problem

Social media has played a big role in fueling the anti-immigration riots engulfing towns and cities in the United Kingdom.

Police hold back rioters near a burning police vehicle after disorder broke out on July 30, 2024, in Southport, England.


And agitator-in-chief Elon Musk is not sitting on the sidelines.

The Tesla chief executive and owner of X posted to the platform Sunday that “civil war is inevitable” in response to a post blaming the violent demonstrations on the effects of “mass migration and open borders.”

On Monday, a spokesperson for the UK prime minister addressed Musk’s comment, telling reporters “there’s no justification for that.”

Musk’s decision to amplify the anti-immigrant rhetoric highlights the role that false information spread online is playing in fomenting real-world violence — an issue of growing concern to the UK government, which vowed Tuesday to bring those responsible for the riots, as well as their online cheerleaders, to justice.

In recent days, rioters have damaged public buildings, set cars on fire and hurled bricks at police officers. They also set ablaze two Holiday Inn hotels in northern and central England believed to be housing asylum seekers awaiting a decision on their claims. Hundreds have been arrested.

The riots broke out last week after far-right groups claimed on social media that the man charged with carrying out a horrific stabbing attack that left three children dead was a Muslim asylum seeker. The online disinformation campaign stoked outrage directed at immigrants.

The suspect, who has since been named as 17-year-old Axel Rudakubana, was born in the UK, according to police.

But false claims about the attack — Britain’s worst mass stabbing targeting children in decades and possibly ever — spread rapidly online and continued garnering views even after the police had set the record straight.

According to the Institute for Strategic Dialogue, a think tank, by mid-afternoon on July 30, the day after the attack, the false name had received more than 30,000 mentions on X alone from more than 18,000 unique accounts.

“The false name attributed to the attacker was circulated organically but also recommended to users by platform algorithms,” the ISD said in a statement.

“Platforms therefore amplified misinformation to users who may not otherwise have been exposed, even after the police had confirmed the name was false.”

According to the UK government, bots, which it said could be linked to state-backed actors, may well have amplified the spread of false information.

Tackling ‘online criminality’

Although social media companies have their own internal policies barring hate speech and incitement to violence from their platforms, they have long struggled to implement them.

“The problem has always been enforcement,” Isabelle Frances-Wright, a technology expert at the ISD, told CNN. “Particularly in times of crisis and conflict, when there is a huge groundswell of content, at which point their already fragile content moderation systems seem to fall apart.”

It does not help matters that Musk himself has promoted incendiary content on X, a platform that European regulators last month accused of misleading and deceiving users. If he can do it, why not others?

For example, shortly after the October 7 Hamas attack on Israel and the ensuing outbreak of the war in Gaza, the self-declared “free speech absolutist” publicly endorsed an antisemitic conspiracy theory popular among White supremacists. Musk later apologized for what he called his “dumbest” ever social media post.

On his watch, X has also relaxed its content moderation policies and reinstated several previously blocked accounts. That includes far-right figureheads like Tommy Robinson, who has published a stream of posts stoking the UK protests, while criticizing violent attacks.

The UK government this week vowed to prosecute “online criminality” and has pushed social media companies to take action against the spread of false information.

“Social media has put rocket boosters under... not just the misinformation but the encouragement of violence,” UK Home Secretary Yvette Cooper said Monday.

“That is a total disgrace and we cannot carry on like this,” she told BBC Radio 5 Live during an interview, adding that the police will be pursuing “online criminality” as well as “offline criminality.”

During a cabinet meeting Tuesday, UK Prime Minister Keir Starmer said those involved in the riots — in person and online — “will feel the full force of the law and be subject to swift justice,” according to a readout seen by CNN.

At the same meeting, Peter Kyle, the minister for science and technology, said that in conversations with social media companies he had made clear their responsibility to help “stop the spread of hateful disinformation and incitement.”

X, Facebook owner Meta and TikTok have not responded to CNN’s requests for comment.

It is unclear whether the UK government has the tools to hold social media platforms accountable for their role in the riots.

The UK’s Online Safety Act, adopted last year, creates new duties for social media platforms, including an obligation to take down illegal content when it appears.

It also makes it a criminal offense to post false information online “intended to cause non-trivial harm.”

But the legislation is not yet in effect because the regulator in charge of upholding it, Ofcom, is still consulting on codes of practice and guidance.

In a statement Monday, Ofcom said tackling illegal content online is a “major priority.” The watchdog expects the first set of duties under the new Act, regarding illegal content, to go into effect “from around the end of this year.” Once the law is in place, Ofcom will be able to fine companies up to 10% of their global revenue.

“As part of our wider engagement with tech platforms, we are already working to understand what actions they are taking in preparation for these new rules,” Ofcom added.

Rob Picheta and Lauren Kent contributed reporting.


