Technology used to counter violent messages online from Islamic State and the far right is being adapted to counter the spread of “anti-vax” conspiracy theories.
Moonshot CVE, a company currently working in as many as 28 countries, uses technology to identify internet users at risk of being radicalised online and to intervene. Its methods have already been deployed against the KKK in the US, and against Isis and the far right in Europe.
Moonshot’s “redirect method”, which involves the use of online advertising targeted at Google and social media users searching for certain extremism-linked keywords, is now being turned to the problem of “vaccine hesitancy”, identified by the World Health Organization as one of the 10 greatest threats to global health this year.
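The article does not describe Moonshot's implementation, but the basic shape of a redirect-style intervention, matching search queries against a curated list of risk-linked phrases and returning a counter-narrative destination instead, can be sketched in a few lines. Everything below (the watchlist, the function name, the example URLs) is invented for illustration; the company's actual targeting is proprietary and delivered through advertising platforms rather than standalone code.

```python
from typing import Optional

# Illustrative sketch only: a hypothetical watchlist mapping risk-linked
# search phrases to counter-narrative destinations. Moonshot's real
# keyword lists and ad-platform delivery are not public.
WATCHLIST = {
    "vaccines cause autism": "https://example.org/vaccine-facts",
    "vaccine microchip": "https://example.org/vaccine-facts",
    "14 words": "https://example.org/exit-support",
}

def match_redirect(query: str) -> Optional[str]:
    """Return a counter-narrative URL if the query contains a watched phrase."""
    normalised = " ".join(query.lower().split())
    for phrase, url in WATCHLIST.items():
        if phrase in normalised:
            return url
    return None

if __name__ == "__main__":
    for q in ["do vaccines cause autism?", "flu shot near me"]:
        print(q, "->", match_redirect(q) or "no intervention")
```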
Moonshot’s office, fitted with spyholes and reinforced doors, sits behind a nondescript entrance in the East End of London. Data provided by the company to the Guardian on online searches performed in London for far-right memes and references over a recent four-month period give a sense of the scale of the problem its tools are designed to address.
They showed 557 searches for the keywords “kill blacks” and a further 126 for “killing blacks”. Other searches included 178 using the keywords “14 words” (a reference to a slogan celebrated in far-right circles) and 56 searches for a PDF of the Turner Diaries, a racist and antisemitic novel credited with partly inspiring the Oklahoma City bomber, Timothy McVeigh.
Vidhya Ramalingam, a specialist on far-right extremism who co-founded the company with Ross Frenett after the pair met at a counter-extremism thinktank, said its employees ranged from software developers and coders to counter-terrorism experts, social workers and mental health professionals.
“We all challenge one another on a daily basis to ensure our methods are ethical, effective and built on evidence from other sectors, while pushing boundaries in our own,” she added.
Typically, Moonshot’s model involves identifying those at risk of being drawn into violent extremism and “signposting” them elsewhere, such as to counselling, job opportunities or counter-narratives to extremist material.
While its founders expect a focus on extremism always to be at the company’s heart, it has most recently been examining how to adapt the model to respond to what it describes as “other destructive communities online”, such as those spreading anti-vaccination theories, those involved in human trafficking, and people vulnerable to gang violence.
“The internet can be used to spread dangerous behaviours and ideas, but there is an opportunity for us to get creative and use technology to solve some of the world’s most complex problems,” Ramalingam said.
The company also works with NGOs to run digital campaigns, which might include the sending of messages to vulnerable individuals. Current campaigns include a pilot intervention in south Asia, where it has been building an app that will help a local mental health NGO mentor those at risk of being drawn into extremism.
“When an individual is engaging with violent extremist content online, they might be searching for this content on Google or posting this content on Facebook,” Frenett said. “They’ll see an advertisement, or receive a direct message, which offers counselling or social support. This is an entry for us. If we can get that person into a one-on-one conversation with a social worker, that’s the starting point for longer-term change.”
Mental health has also featured in past campaigns, such as one aimed at countering neo-Nazism in Australia, where individuals were provided with contact details for counselling.
In the UK, one pilot programme worked with a gym to reach young people at risk of being drawn into gangs: vouchers to use the gym were offered as part of a campaign of Facebook adverts targeted at those at risk of far-right radicalisation.