Platforms are rewarding inflammatory posts
Platforms like X thrive in crises like these, by design. The more engagement a post receives – likes, shares, clicks – the more likely it is to spread. That means inflammatory and misleading content on high-use platforms spreads fast. Unfortunately, the main moderation mechanism used by X – Community Notes – is proving too slow to keep up.
Community Notes is a feature on the platform which allows users to collaboratively add context and corrections to potentially misleading posts. The model uses a “community-driven” approach to combat misinformation, without funding professional moderation teams or fact checkers.
Demos’s analysis of Community Notes created on X during the Southport riots found that fewer than 5% of proposed Community Notes were actually published. Of the Notes that did go live, the time it took to publish them failed to stop people seeing, and acting on, false and harmful content. On the day the riots broke out, the delay to Community Note publication stretched to almost 20 hours. By then, the damage was done: some posts explicitly threatening marginalised communities and promising violent action had racked up more than 67 million views.
Our research also looked at posts published over the course of the riots, finding that more than one in five directly threatened racialised groups. One post that falsely claimed the attacker was Muslim received 1.5 million views. Another alleging he was an “illegal immigrant” reached nearly 1.3 million. More than half of these harmful posts targeted migrants, 36% focused on Muslims, and a third were explicitly xenophobic or racist.
But racism isn’t just online – it’s political
It is true that social media sparked the fires behind Southport and Epping. But we also have to address the broader context. Over the past decade, anti-migrant and Islamophobic rhetoric has become increasingly normalised in British politics. Brexit saw a spike in racist rhetoric, particularly against Muslims, following an explicitly anti-migrant Leave campaign. In 2020, Islamophobia accounted for 50% of religious hate crimes in official figures, and research shows a clear link between political rhetoric and rising instances of racism and Islamophobia.
Advertising helps fund Big Issue’s mission to end poverty
Even mainstream parties have pandered to this approach. Though since apologised for, the prime minister’s ‘island of strangers’ speech – criticised for echoing the rhetoric of Enoch Powell – comes to mind. When political leaders echo racist narratives, it legitimises hate and opens the floodgates for more extreme voices online. It is easy to pretend that racism is confined to anonymous accounts, but in volatile times, all those involved in shaping public debate – from elected politicians to social media giants – have a role in ensuring the information we access is reliable and safe.
Where do we go from here?
Social media now plays an unprecedented role in societal stability, and regulation must be updated to keep up with the fact that, according to Ofcom, half of UK adults use such platforms as a news source.
That is why Demos and Full Fact are calling for platforms to assess and tackle the systemic risks harmful mis- and disinformation poses to public security. First, through the introduction of a ‘Fast Notes’ system that would allow high-risk content to be flagged and responded to quickly, rather than waiting for slow consensus. Second, through urgently strengthening systems for removing racial and religious hatred.
Recent disorder in Epping shows how urgent the need for effective platform moderation is, particularly as the Community Notes model sees increasing adoption by other social media giants, now including YouTube, Meta and TikTok in the US.
We cannot compromise on anti-racism, whether in the real or digital world. This push has to go hand in hand with greater work on anti-racism that must start at the political level. For example, the newly formed Community Cohesion Commission, announced earlier this summer, could be a turning point. But it must act quickly and transparently, sharing regular updates on its work and progress. Tackling division means addressing racism head-on, both in our communities and in the political narratives that sustain it.
We must be clear – online harm is a public safety issue. And unless we take real action across tech and politics, we risk seeing more riots, more division, and more citizens’ safety put at risk.