Editorial 

The Guardian view on fake video: a trick too far

The spread of cheap and easy techniques for video faking is already damaging famous people. It risks destroying trust even at a community level
  
  

‘Some of the most powerful techniques are within reach of anyone who takes the time and trouble to download the right program.’ Photograph: Andrew Brookes/Getty/Cultura RF

Last week a video was widely shared that apparently showed a balaclava-wearing “vigilante” attacking what was said to be a drug dealer’s car somewhere in Bolton. In the end, the police were able to show that the video was not real, but had been staged in order to suggest the police were not keeping the streets safe. It was a vivid illustration of how damaging fake video – fake views – can be, to individuals and also to communities. Then at the weekend, the American glossy Vanity Fair became the centre of angry protests after extensively doctoring a group image of this year’s big stars. It is hardly a surprise to learn that images are untrustworthy; forensic analysis of still and video images has been around for decades. But fakes are getting smarter all the time. The latest applications of artificial intelligence can now create false video that is almost impossible to detect. The camera does not just lie; it lies really convincingly.

It is true that from the beginning, video and sound recordings have exploited their apparent incorruptibility to mislead: there has been film propaganda since the first world war, and later the work of, say, Leni Riefenstahl was profoundly dishonest in intent and execution. But the process of misleading was time-consuming and costly, and the difference between animations and real actors was obvious. Then Hollywood started to break down the barriers with creatures like Gollum appearing alongside real actors. Now dead actors can appear in films alongside their living colleagues.

Until very recently, such magic required a great deal of expensive and sophisticated computing power. But that is exactly what companies such as Google and Amazon are now making cheap and accessible. Some of the most powerful techniques are within reach of anyone who takes the time and trouble to download the right program. Using only a home computer, they can graft the face of one person on to the body of another in a convincing video simulation. There are already 30,000 followers of the Reddit group where these dark crafts are practised.

This home technology has the potential to be uniquely damaging. Face-swapped pornography, one of the first uses to which it has been put, transposes the faces of public figures on to other people’s bodies. Videos have been made using pictures of actors, pop stars and celebrities in a grotesque assault on their integrity and reputation. But it could cause wider harm still if it is exploited to distort political argument, or inflame community difference, in this almost undetectable manner.

Whatever you want to believe, there will be video evidence to support your prejudice. If you wish to make a video of Barack Obama confessing that he was born in Kenya, now you can. If you’d prefer to make the famous lost video of Donald Trump in Moscow, that will be available, too, in several versions. This must lead to a general, corrosive growth of suspicion and distrust in society. Real evidence can and will be dismissed as entirely fake.

Innovation has often caused problems that it cannot fix. No one can unknow nuclear technology. At the same time, the construction of trust is an infinitely slower process than its destruction. Maybe no society could survive on the basis of universal truth-telling. But it is possible to create the conditions where trust can be encouraged to flourish. Scepticism is a healthy response to power. But where deception is practised so plausibly, scepticism tips into something darker. We need to think in terms of reasons to believe, if there is to be a chance of tipping the balance of public discourse back towards honesty.

 
