Facial recognition cameras prevent crime, protect the public and do not breach the privacy of innocent people whose images are captured, a police force has argued.
Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using facial recognition technology on him.
But Jeremy Johnson QC compared automated facial recognition (AFR) to the use of DNA to solve crimes and said it would have had little impact on Bridges.
Johnson, representing the police, said: “AFR is a further technology that potentially has great utility for the prevention of crime, the apprehension of offenders and the protection of the public.”
The technology maps faces in a crowd and then compares them with a watch list of images, which can include suspects, missing people and persons of interest to the police. The cameras scan faces in large crowds in public places such as streets, shopping centres, football matches and music events including the Notting Hill carnival.
Johnson said the process also included human interaction: “It is up to the operator to decide whether the person is a match or not. You then have the intervention.
“It’s not that the operator makes their own assessment; the officer on the ground looking at the individual will make their own assessment and will decide whether or not to intervene and speak to the individual.”
Johnson told the hearing at Cardiff civil and family justice centre that under common law the police had the power to use visual imagery for the “prevention and detection of crime”.
Lawyers for Bridges have argued that the use of AFR is unregulated, but Johnson said police must adhere to data protection rules and have a code of practice for the management of information.
The court heard South Wales police did not believe that article 8 of the European convention on human rights – which protects the right to respect for private life and is given effect in UK law by the Human Rights Act – or the Data Protection Act had been breached by the use of CCTV or AFR cameras.
Johnson argued a police officer monitoring CCTV manually had the same “practical impact” on an individual as an AFR camera.
He said: “So far as the individual is concerned, we submit there is no difference in principle to knowing you are on CCTV and somebody looking at it.”
Johnson added that those not on a watch list would not have their data stored after being scanned by AFR cameras.
The court heard a trial period for the use of AFR started in south Wales in May 2017 and is still under way.
Bridges believes his face was scanned while he was shopping in 2017 and at a peaceful anti-arms protest in 2018, and that this had caused him distress. He has used crowdfunding to pay for the legal action with the support of the human rights organisation Liberty. It argues AFR has profound consequences for privacy and data protection rights.
But Johnson said: “It’s difficult to say that an automated immediate computerised comparison is more intrusive than police officers sitting down looking at albums of photographs.”
The court heard Bridges was not on a watch list. Johnson said: “He was not spoken to by a police officer, far less arrested. We say the practical impact on him was very limited.”
Johnson said AFR was used at the anti-arms trade protest in Cardiff in 2018, which Bridges attended. A woman who had made a bomb threat at the same event the previous year was therefore on a watch list, he added.
The barrister said: “It’s of obvious value for these police officers to know that person is there so that if another bomb threat is made they can deal with it accordingly. We say a fair balance has been struck.”
The case continues.