LONDON (Reuters) - The British government said on Monday officials were examining what role foreign states had played in amplifying online disinformation that helped fuel violent protests, while warning social media firms they had to do more to stop it.
Trouble flared last week following the murder of three girls at a Taylor Swift-themed event in Southport, a seaside town in northern England, after posts on social media falsely identified the suspected killer as an Islamist migrant.
Protests by anti-Islam and anti-immigration groups then spread to other towns and cities across Britain, with mosques and hotels housing migrants targeted, leading to violent clashes with police.
Jacob Davey, director of policy and research at the Institute for Strategic Dialogue (ISD), said the flood of online disinformation and the role of the social media firms themselves had been key.
"I don't think we can underestimate how central the spread of this information is to the horrific events of the weekend," he told Reuters.
In response, the government, which has for years accused countries such as Russia of seeking to sow discord, said it was assessing how far foreign states had been involved in promoting the false messages.
"We have seen bot activity online, much of which may well be amplified or have the involvement of state actors, amplifying some of the disinformation and misinformation that we've seen," a spokesperson for Prime Minister Keir Starmer told reporters.
"It is clearly something that is being looked at."
Elon Musk, the owner of X, has also weighed in. Responding to a post on X that blamed mass migration and open borders for the disorder in Britain, he wrote: "Civil war is inevitable."
Davey said disinformation was spread not only by those seeking to cause trouble but by the social media platforms themselves, whose business models rely on algorithms designed to amplify narratives that gain traction online.
"You saw that in the trending in the UK topics, you saw that disinformation cropping up under searches for Southport ... That business model side of things is really important."
Disinformation was also pushed by high-profile anti-immigrant activists. Stephen Yaxley-Lennon, known by the pseudonym Tommy Robinson and previously the leader of the now-defunct anti-Islam English Defence League, has been blamed by media for spreading misinformation on X.
He was banned from the platform in 2018 for producing hateful content, according to media reports at the time, but was reinstated after Musk bought it.
PLATFORM FOR HATE
Britain brought in a new Online Safety Act last year to tackle issues such as child sexual abuse and the promotion of suicide, but Professor Matthew Feldman, a specialist in right-wing extremism at the University of York, said it might not help in this situation.
It did not appear to cover "online incitement to offline criminality or disorder", he said.
Feldman said far-right groups were less organised than they had been more than a decade ago, when the likes of the British National Party could boast thousands of members; now no organisation had more than a few hundred.
Despite that, they remain highly visible, with extremists and influencers exploiting modern technology to capture attention, he said.
Nor had the trouble come out of nowhere, said the ISD's Davey: there had been unrest outside migrant centres, disorder at last year's Remembrance Day events, and thousands of people turning out in central London a few weeks ago in support of Yaxley-Lennon.
"I think that this is really the accumulation of a much longer process whereby we've seen extremist groups become more confident," he said.
(Reporting by Michael Holden; editing by Giles Elgood)