Unless you’ve been on a silent retreat for the past year, you will almost certainly have heard the rumours – that the pandemic is an elaborate hoax, or that the virus was created as a Chinese weapon, or that dangerous elites are trying to kill off the elderly and establish a new world order, or that the symptoms are caused by 5G.
It is troubling enough to see these ideas on social media. But when you hear them from your family, your friends or a casual acquaintance, it is even harder to know how to respond. You will struggle to convince the most committed believers, of course, but what about people who are only flirting with the ideas?
These difficult conversations are only set to increase now that a new vaccine is on the horizon. Certain niches of the internet are already rife with the “plandemic” theory, which alleges that the spread of the virus has been designed to create big bucks for pharmaceutical companies and the philanthropist Bill Gates (whose charity is funding many of the efforts). The idea has been debunked numerous times – while there is good evidence that conspiracy theorists such as David Icke are themselves reaping huge profits from spreading misinformation. The danger, of course, is that their ideas will discourage people from taking the vaccine, leaving them vulnerable to the actual disease.
Since many conspiracy theories arise from feelings of uncertainty and fear, an angry debate will only cement the ideas, and open ridicule is even less constructive (see panel, below). Instead, the research shows that you should try to focus on the rhetorical devices and tricks of persuasion that were used to spread the ideas in the first place. “People seem receptive to you exposing the ways in which they may have been manipulated,” explains Dr Sander van der Linden at Cambridge University, who has pioneered research into the spread of misinformation and the ways to stop it.
Fortunately, the exponents of these conspiracy theories often use the same rhetorical devices, and a familiarity with these arguments will help you to politely articulate the faulty reasoning behind many different forms of misinformation. Read on to discover the five most common fallacies favoured by conspiracy theorists, and the best ways to respond.
1. Hunting an invisible dragon
In a memorable thought experiment, the astrophysicist and writer Carl Sagan described taking a visitor to see a fire-breathing dragon in his garage. Upon entering, the visitor was surprised to find an empty space – but Sagan replied that he had simply forgotten to mention that the dragon was invisible. The visitor then proposed scattering flour on the floor to trace the dragon’s footprints, only to be told that this would be of no use because the dragon hovered off the ground. When the visitor suggested using an infrared camera, he was told that the dragon’s flames were heatless. There is no way, in other words, to either prove or falsify its existence.
This kind of argument is known as special pleading: you essentially move the goalposts whenever someone asks for evidence of your claim – a tactic common to many conspiracy theories.
In science, it is usual for new findings to be scrutinised by other researchers – who examine the methods and results – before they are published in a journal such as Nature or The Lancet, a process known as peer review. But if you were to ask, for example, why there is no credible research proving the dangers of vaccines, or the link between 5G networks and Covid-19 symptoms, you may be told that there is a concerted effort to prevent such evidence from being released. Indeed, the absence of reliable evidence is itself taken as proof of the conspiracy. The fact that major scientific institutions across the globe support the “mainstream” view only shows how good the cover-up has been.
Like Sagan’s invisible, heatless, incorporeal dragon, such special pleading means that the misinformation can never be falsified in the eyes of the conspiracy theorist. If you are faced with this kind of reasoning, you might question the probability of arranging such a widespread conspiracy across so many organisations in so many countries without leaving any traces. Many people, after all, could benefit from exposing the plot – if it were supported by good evidence. (For a journal or newspaper, it would be the biggest scoop since Watergate – a truly world-changing piece of investigative journalism.) It might also be worth asking what kind of evidence would lead your acquaintance to change their mind – a simple prompt that can help to highlight the fact that the theory is essentially unfalsifiable.
2. Fake authority
If they can’t present any solid scientific evidence, conspiracy theorists may name impressive-sounding witnesses who apparently endorse their worldview.
A quick Google search will reveal that many of these names (or their supposed credentials) are completely fake. Alternatively, the talking head may be a real person with some expertise, but not in the relevant field – yet their opinions are painted as the authoritative take. A conspiracy theorist may be able to find a GP or a surgeon, say, who, for a few minutes of notoriety, is willing to argue that the virus is a hoax. But it’s worth questioning whether that rogue figure is as credible as the thousands of trained virologists who have studied the virus’s structure or the epidemiologists examining its spread.
You may see articles by Vernon Coleman, for instance. As a former GP, he would seem to have some credentials, yet he has a history of supporting pseudoscientific ideas, including misinformation about the causes of Aids. David Icke, meanwhile, has hosted videos by Barrie Trower, an alleged expert on 5G who is, in reality, a secondary school teacher. And Piers Corbyn cites reports by the Centre for Research on Globalisation, which sounds impressive but was founded by a 9/11 conspiracy theorist.
Finally, some conspiracy theorists greatly exaggerate debates among experts themselves. Not all epidemiologists will agree on the best measures to reduce the spread of the virus, but this disagreement shouldn’t be used to justify the idea that the whole pandemic has been engineered by the government for some nefarious end.
Consider the so-called Great Barrington Declaration, an online document that argues we should aim for herd immunity while protecting vulnerable people from infection. The authors of the original document are three scientists, but the declaration was accompanied by a petition that did not verify the credentials of its signatories, many of whom gave false names or had no expertise in the area. In reality, the document represents a fringe view, unsupported by most epidemiological research, and thousands of other researchers have rejected the basic premise of its argument – that herd immunity is achievable without a vaccine. The declaration certainly doesn’t reveal widespread dissent among real experts, yet it is often cited by professional conspiracy theorists such as David Icke and “lockdown sceptics” such as Toby Young and Allison Pearson.
The tobacco industry used these tactics to great effect in the 1970s, with adverts that quoted fake experts and rogue scientists who questioned the harms of smoking.
“It’s a really persuasive form of misinformation,” says Prof John Cook, an expert in “science denial” at George Mason University. Fortunately, he has found that educating people about the history of this common deceptive tactic can make them more sceptical of other fake experts they encounter later on.
3. Coincidence or covert operations?
In September this year, the former Republican congressional candidate DeAnna Lorraine had a frightening epiphany. “I find it very interesting how the show The Masked Singer hit America in January 2019, a little bit over a year before they started forcing us all into masks. It’s almost like they were beginning to condition the public that masks were ‘normal’ and ‘cool’,” she wrote on Twitter. “The media is demonic.”
Most people had the good sense to dismiss Lorraine’s theory, but this tendency to claim some kind of causal connection from a random coincidence has given birth to many other unfounded ideas. “Conspiracy theorists tend to take a grain of truth, then cast another narrative around it,” says Van der Linden.
The fact that 5G arrived at roughly the same time as coronavirus, for instance, is not evidence that its electromagnetic waves caused the disease. As Cook points out, the character Baby Yoda also arrived in late 2019 – but who would claim that he had caused widespread illness?
The problem of over-reading coincidences might explain why many people still believe that the MMR vaccine can lead to autism. We now know that Andrew Wakefield’s original paper proposing the link was fraudulent and based on fabricated data. The problem is that the typical signs of autism often become more apparent in a child’s second year – around the same time they receive the vaccine. This is just a coincidence, but some people take it as evidence for the theory, despite the fact that large studies have repeatedly shown that autism is no more common among vaccinated children than among unvaccinated ones.
Similarly, you may be shown reports of Bill Gates discussing the possibility of a global pandemic long before 2020 – which some, like Piers Corbyn, have taken as evidence for the “plandemic” theory. In reality, the risk of a novel disease entering circulation has been a serious concern for many years, and many organisations, not just Gates’s charities, had been preparing for the eventuality. By this logic, you could just as easily point to the 2011 film Contagion and argue that its director, Steven Soderbergh, had been plotting the whole thing.
4. False equivalence
When you hear an analogy between two separate scenarios, be aware that the speaker may be comparing apples and oranges.
You might have heard the argument that “we have thousands of deaths from car crashes each year – yet we don’t shut down the country to prevent those”. The problem, of course, is that car crashes are not contagious, whereas a virus is, meaning that the number of infected people can grow exponentially until it overwhelms the health service. There may be a nuanced debate over the most effective ways to prevent that scenario, but false analogies of this kind are used to dismiss the need to prevent contagion altogether, allowing the conspiracy theorist to ascribe a more sinister intent to any new measures.
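To see why the comparison fails, it helps to put some illustrative numbers on it (these figures are purely for the sake of argument, not a model of the real disease): if each infected person passed the virus to three others every five days or so, 100 cases would become roughly 8,000 new infections within three weeks, and more than 70,000 within a month. A hundred car crashes, by contrast, remain a hundred car crashes however long you wait.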
Cook says that this is one of the most commonly used fallacies, but it’s easy to identify. “Look at the differences between the two things being compared, and if that difference is important for the conclusions, then it’s a false equivalence.”
5. The thought-terminating cliche
I was recently discussing the contagion’s exponential growth with a member of my own family. He was sceptical. “You can prove anything with data,” he told me. “It’s all lies, damned lies and statistics.” This is known as a thought-terminating cliche, in which a proverb or saying is used to end further discussion of a point without addressing the argument itself.
At this point, it’s probably time to leave the discussion for another day. As Van der Linden points out, the important thing is to maintain the possibility of continued open dialogue. “We need to have repeated conversations in an environment of mutual respect.” To quote another cliche, it is sometimes best to agree to disagree.
The art of pre-suasion
If you want to change someone’s mind, you need to think about “pre-suasion” – essentially, removing the reflexive mental blocks that might make them reject your arguments.
The first step is to establish empathy. “Often, these people are very worried about something and this issue is important to them,” says Prof Karen Douglas, a psychologist who studies conspiracy theories at the University of Kent. “It would not be constructive to go into the conversation in a hostile manner, because this delegitimises their concerns and might alienate them even more.”
Douglas advises that you make the effort to understand the origins of their beliefs, a point of view that Cook shares. “You want someone to articulate what they’re thinking, and why they’re thinking it, in a non-confrontational way,” he says. In describing the theories, they may notice some of the contradictions and holes in the logic for themselves. If not, you will at least be in a more informed position to start a constructive discussion.
It may be worth acknowledging that certain conspiracies – like Watergate – have occurred in the past, but that these were supported by incontrovertible evidence rather than rumour and supposition. “It can validate people’s worldview,” says Van der Linden. And that, he says, might offer a “gateway” that makes them more open to your arguments.
You might also talk about people within the “movement” who have since changed their views. There are now, for example, many reports of erstwhile Covid-19 deniers who have since contracted the disease and renounced their former beliefs – and their experiences may be more persuasive than your own opinions.
David Robson is a science writer and author of The Intelligence Trap: Revolutionise Your Thinking and Make Wiser Decisions (Hodder & Stoughton £9.99). To order a copy go to guardianbookshop.com. Delivery charges may apply