Jim Waterson Political media editor 

British female politicians targeted by fake pornography

Leading politicians victimised by online material including AI deepfakes, investigation finds
Victims include Labour's deputy leader, Angela Rayner (left), and the Commons leader, Penny Mordaunt (right), pictured at ITV's election debate. Photograph: Getty Images

British female politicians have become the victims of fake pornography, with some of their faces used in nude images created using artificial intelligence.

Political candidates targeted on one prominent fake pornography website include: the Labour deputy leader, Angela Rayner; the education secretary, Gillian Keegan; the Commons leader, Penny Mordaunt; the former home secretary, Priti Patel; and the Labour backbencher Stella Creasy, according to Channel 4 News.

Many of the images have been online for several years and have attracted hundreds of thousands of views.

While some are crude Photoshops featuring the politician's head superimposed on another person's naked body, other images appear to be more sophisticated deepfakes created using AI technology. Some of the politicians targeted have now contacted police.

Dehenna Davison, a Conservative MP until the recent dissolution of parliament, is one of those featured on the site. She told Channel 4 News it was “really strange” that people would target women like her and she found it “quite violating”.

She said that unless governments around the world put in place a proper regulatory framework for AI, there would be “major problems”.

Creasy told the broadcaster that she felt “sick” to learn about the images and that “none of this is about sexual pleasure, it’s all about power and control”.

Nonconsensual deepfake technology, which takes a photograph of an individual and uses artificial intelligence to strip away their clothing or generate a fake nude image, has become a growing problem amid the wider AI boom.

Earlier this year the Guardian investigated ClothOff, an AI app that invites users to "undress anyone using AI", which channelled its transactions through a company registered in London and has caused chaos in some schools.

Thousands of female celebrities are already victims of fake pornography. The site featuring the female British politicians, which the Guardian has not named, features user-created content and claims to only host lawful content featuring adults.

Since the Online Safety Act came into force in January, sharing such imagery without consent has been illegal in the UK. Yet sites hosting this material remain easily accessible through mainstream search engines such as Google.

The creation of such material also remains legal in the UK. The government announced plans in April to close this loophole and ban the creation of deepfake pornography in England and Wales, but the proposed law was dropped when Rishi Sunak decided to call an early election.

The Conservatives, Labour and the Liberal Democrats have pledged to bring it back if they win the next election, meaning the creation of such images is also likely to be banned.

The UK’s stance on deepfake pornography is tougher than that of many other countries. This has already had an impact, with some of the biggest sites pre-emptively blocking British users rather than risk the potential legal ramifications.

The US representative Alexandria Ocasio-Cortez is pushing for similar laws in the US. She said encountering a deepfake of herself performing a sex act resurfaced past trauma and predicted that “people are going to kill themselves over this”.
