Rarely a day goes by without reference to “propaganda”. The outgoing German ambassador to the UK regrets there was “so much propaganda in the media” at the time of the 2016 Brexit referendum. Turkish authorities have this week put 300 more people in jail for “propaganda”. Reuters recently reported that “Twitter may notify users exposed to Russian propaganda during the 2016 US election”. But how to define it?
A recent discussion in Oxford with a group of young Russian journalists and civil society activists brought some interesting insights, especially for those of us lucky enough to live in democracies. The seminar was organised by the Moscow School of Civic Education, which was founded in the 1990s to promote critical thinking and democratic values among the younger, post-Soviet generation. In 2015 the school was closed down in Moscow after it was labelled a “foreign agent”. It survives in exile, registered as a UK-based charity.
It was fascinating that the young Russians who had travelled to Oxford from as far away as eastern Siberia were much more resilient to propaganda than I had expected. As one young Russian put it to me: “We’ve got used to the propaganda. We simply try to live aside from it.” This may well have been a skewed sample – a minority of well-educated, connected people. But what was encouraging was that their world outlook seemed partly immune to state propaganda precisely because they had developed a capacity to detect it.
Perhaps there are lessons for those of us in Europe and in the US who worry about Russian meddling and other disinformation. Perhaps there are antidotes to propaganda that go beyond the necessary effort of checking what your sources of news are and making sure they are reliable. Citizens who live in an authoritarian, disinformation-filled environment deal daily with the reality of propaganda in ways we can’t fully experience, because we live outside it.
Most of us tend to focus on how disinformation spreads across our societies – the bots, the trolls, the technological machinery of “fake news”. We spend perhaps too little time thinking about the very essence of propaganda: the ingredients that go into a dish meant to captivate us, play on our emotions, and control what we think. If you are able to break those ingredients down, you become less vulnerable.
In the late 1930s, an American professor at Columbia, Clyde R Miller, drew up a list of the devices he believed defined propaganda. The young people I met had fun identifying these ingredients in examples of what their state media and politicians constantly spew out to the public. The more I listened to them, the more I thought: this needs to be taught or shown to young people across our democracies. Resilience to propaganda can’t be built only by denouncing it, or by creating fact-checking mechanisms and websites that debunk “post-truth” – however useful and necessary those efforts are. Providing the tools to identify propaganda allows us to act upstream, before it even starts exerting its influence.

First, there is “name-calling”: propaganda that systematically attaches labels to what it wants to condemn, forming its judgments without examining any evidence and picking labels to match: “fascist”, “red”, “terrorist”.
Next come the “glittering generalities”: propaganda that strives to associate itself with shining ideals whatever the contrary evidence. It suggests that if you’re good and virtuous, you will believe it. Then there is “transfer”: propaganda that carries over the authority and prestige of something we respect to something it wants us to absorb – in Russia, for example, the “great patriotic war” and Vladimir the Great are constantly recycled. Add to that the “testimonial” device: propaganda that seeks to secure approval from prominent names, or “useful idiots”, confident that people will follow a leader.
Utilising the “plain folks” device, propaganda strives to sound and look like something common, close to the people – it can adopt vulgar language or an anti-elitist slant: it’s one of us. Then comes “card-stacking”: propaganda selects only those facts that support its assertions. And then there is the “bandwagon” device. Follow the crowd, it says: everyone thinks this, so why stand out?
We could add “whataboutism” to the list: the art of minimising negative aspects or crimes (such as the bombing of civilians in Aleppo) by pointing to what others have done elsewhere (the US army in Iraq, or discrimination in the US). Lumping everything together feeds confusion and indifference.
Last year, in a study of organised social media manipulation across 28 countries, two Oxford academics made this observation: “Every authoritarian regime has social media campaigns targeting their own populations, while only a few of them target foreign publics. In contrast, almost every democracy has organised social media campaigns that target foreign publics, while political party-supported campaigns target domestic voters.”
“Propaganda” rears its head in so many countries in so many ways. It reached unprecedented prevalence and refinement in the totalitarian regimes of the 20th century. Now it’s a new normal in Russia, Turkey and arguably Poland, all states that have experienced various stages of democratic backsliding in recent years.
The use of propaganda is ancient, but never before has there been the technology to so effectively disseminate it, and rarely has the public mood been so febrile. If identifying lies and distortions was desirable before, isn’t it now essential self-defence?
• Natalie Nougayrède is a Guardian columnist