How did a single social media post, removed within hours for being false, manage to gain millions of views and be cited as credible evidence in relation to the Southport attack?
On July 29, a tragic incident involving fatal stabbings at a children’s dance class set off widespread riots in England and Northern Ireland. This unrest was significantly fueled by misinformation circulating on social media, specifically rumors alleging that an illegal migrant was the suspect.
While responsibility for the resulting violence cannot be pinned on any one individual or post, the BBC has previously highlighted a pattern in which influential social media figures amplify messages that incite protests.
In the immediate aftermath of the Southport attack, various posts began spreading false claims from a range of sources, including self-described news accounts. This misinformation quickly coalesced. By the evening of July 30, some of these incorrect assertions had gained traction via prominent online personalities like Andrew Tate, who repeated these narratives to millions of viewers on X.
However, one LinkedIn post stands out as particularly impactful in perpetuating the false narrative that the dance class suspect was a migrant. An analysis by the BBC reveals that this post, authored by a local resident, greatly contributed to the misinformation.
Eddie Murray, a man living near Southport, made the post approximately three hours after the attack, claiming that a migrant was responsible for the violence: “My two youngest children went to holiday club this morning in Southport for a day of fun only for a migrant to enter and murder/fatally wound multiple children. My kids are fine. They are shocked and in hysterics, but they are safe. My thoughts are with the other 30 kids and families that are suffering right now. If there’s any time to close the borders completely it’s right now! Enough is enough.”
Mr. Murray’s post implied his family had been at the scene of the attack; however, BBC investigations later revealed that while they were in the vicinity, they had actually been turned away from the dance class due to capacity constraints. Murray later stated that he simply shared information he believed to be correct.
The rapid spread of this claim illustrates how unverified information can proliferate without regard for accuracy. Shortly after the stabbings, Merseyside Police released only limited details about the suspect, standard procedure, particularly when the individual is underage. A brief police statement said that “armed police have detained a male and seized a knife.”
Despite the lack of official detail, speculation on social media surged. Murray’s post reached only a few hundred views before LinkedIn removed it for breaching its policies on “harmful or false content.” By that time, however, it had already been duplicated elsewhere and amassed more than two million views, as reported by BBC Verify.
Within an hour of Murray’s original post, a screenshot was shared by an account demanding mass deportations, racking up more than 130,000 views. Shortly afterwards, an Indian news website, Upuknews, reposted the screenshot and labelled it “confirmed,” pushing its reach to over half a million viewers.
As speculation intensified about the suspect’s identity, others, including far-right activists, seized upon the rumor. Paul Golding, co-leader of the far-right group Britain First, claimed evidence was mounting that the Southport attack was perpetrated by a migrant, leading to considerable exposure for the misinformation.
The cycle of misinformation continued to escalate, with various individuals claiming to “confirm” the accuracy of Murray’s post. This included claims that the suspect was named “Ali-Al-Shakati,” a false identity that continued to circulate despite police clarifications.
Notably, the aftermath of these events catalyzed government scrutiny over how misrepresented information on social media can lead to unrest. Jonathan Hall KC, the government’s independent reviewer of terrorism legislation, stated that the current legal framework might encourage the spread of online disinformation.
The media regulator Ofcom found a direct link between social media posts and the violent incidents in England and Northern Ireland after the stabbings, asserting that misleading content spread rapidly following the Southport attack. The government is working swiftly to implement the Online Safety Act, which seeks to compel social media platforms to remove illegal content and prevent the spread of false narratives.
The connection between online misinformation and real-world consequences remains a pressing concern, and the fallout from the Southport riots underscores the urgent need for responsible information-sharing practices in the digital age.