Experts fear the decision by Meta to drop professional factcheckers from Facebook will exacerbate so-called boomer radicalisation in the UK.
Even before what Keir Starmer described as “far-right riots” in England last summer, alarm bells were ringing amid fears that older people were more susceptible to misinformation and radicalisation than younger “digital natives”.
Suspects in those riots were generally older than those charged over the 2011 unrest, according to a Guardian analysis of hundreds of defendants, which found that as many as 35% were in their 40s or older.
However, after Mark Zuckerberg announced last week that Meta would replace factcheckers with a crowdsourced system and recommend more political content, there is fresh concern about the radicalisation risks on Facebook, the social media platform of choice for many older people.
“It’s clearly a retrograde step that comes with all sorts of risks,” said Dr Sara Wilford of De Montfort University, a lead researcher on a pioneering Europe-wide project called Smidge (Social Media Narratives: Addressing Extremism in Middle Age).
“X might be the model for the crowdsourced ‘community notes’ approach that Meta seems to be embracing, instead of professional moderators, but it just won’t work in the same way with Facebook, which very much operates in little silos or closed groups. I’m concerned that, for middle-aged Facebook users who risk being exposed to extremist content, it will be even harder to discern the truth.”
The anti-extremism campaign group Hope not Hate also told the Guardian it feared Zuckerberg’s announcement was a prelude to far-right figures and groups, such as Tommy Robinson and Britain First, being allowed back on to Facebook.
Britain First proved particularly adept at using the platform before it was banned, amassing 2m likes – at that stage surpassing Labour (1m) and the Conservatives (650,000).
Young men still account for the majority of perpetrators. Yet before the riots, discussion of boomer radicalisation had already been sparked by cases such as that of Darren Osborne, who was 48 when he was jailed in 2018 for a lethal terror attack at a mosque in Finsbury Park, north London, having been, in the words of the judge, “rapidly radicalised” online.
Another man, Andrew Leak, was 66 when he firebombed a Dover migrant centre in 2022 in what police described as an “extreme rightwing” attack. He killed himself shortly afterwards, leaving behind an internet history riddled with racism.
When it came to the riots, Hope not Hate said Facebook was used in a particular way by the far right, in contrast to other platforms. “Telegram was for whipping up the most extreme hate, or sometimes plotting and planning, while X was used to disseminate that message,” said Joe Mulhall, the anti-racism campaign group’s director of research.
“Facebook was then often where you would see a group creating hyperlocal targeted content, with a page popping up around a specific event. We’ve also seen over the last three to four years that anti-migrant protest Facebook groups were really fundamental in organising the targeting of asylum centres.”
Users of such pages often skew older, in line with broader Facebook usage. An Ofcom report last year found that overall, social media users were most likely to describe Facebook as their main social media site (48%), but added: “This was driven by its significant popularity among older social media users.” It also warned that older adults were less likely to recognise a fake social media profile.
Wilford said her research suggested some older Facebook users were particularly vulnerable, for reasons including a reluctance to factcheck and a tendency to take online content at face value when it was packaged like conventional news output.
“We are also talking about people – an invisible generation – sometimes looking back on a life that might not have been as they wanted it to be, whether it was their job or social conditions,” she added. “But when they go online and interact with other Facebook users, they are embracing an echo chamber that makes them feel good. They’re listened to and find validation.”
The problem of misinformation seeping into Facebook groups devoted to everyday life has drawn a response from some councils, which have put resources into training the members of the public who moderate local community groups.
But Britain’s seismic political events of recent years have also transformed the Facebook experience for many whose first innocent interactions may have been sharing pictures with family or posting neighbourhood news.
Brexit, Trump’s 2016 win and the Covid pandemic acted as catalysts for engagement with more extreme forms of rightwing politics via Facebook, according to Dr Natalie-Anne Hall, a lecturer at Cardiff University and author of Brexit, Facebook, and Transnational Right-Wing Populism.
“Facebook is a key site for algorithmically driven encounters with these harmful ideas within people’s everyday practices of social media use. Meta should be doing more, not less, to combat this harm,” she said.
“Zuckerberg’s comments and Meta’s new position on this issue will only serve to embolden the misplaced sense of victimhood among those with antiprogressive views that research has shown feeds into radicalisation.”
When asked to comment on the concerns about misinformation and extremism, Meta referred to a blogpost, which said its “complex systems” to manage content had “gone too far”.