John Naughton 

Facial recognition firms should take a look in the mirror

Clearview AI was fined for using internet-sourced images of UK residents in its database – but not before police forces used its service
  
  

Clearview AI CEO Hoan Ton-That demonstrates the company's facial recognition software. Photograph: Seth Wenig/AP

Last week, the UK Information Commissioner’s Office (ICO) slapped a £7.5m fine on a smallish tech company called Clearview AI for “using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition”. The ICO also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet and to delete the data of UK residents from its systems.

Since Clearview AI is not exactly a household name, some background might be helpful. It’s a US outfit that has “scraped” (ie digitally collected) more than 20bn images of people’s faces from publicly available information on the internet and social media platforms all over the world to create an online database. The company uses this database to provide a service that allows customers to upload an image of a person to its app, which is then checked for a match against all the images in the database. The app produces a list of images that have similar characteristics to those in the photo provided by the customer, together with links to the websites whence those images came. Clearview describes its business as “building a secure world, one face at a time”.

The fly in this soothing ointment is that the people whose images make up the database were not informed that their photographs were being collected or used in this way, and they certainly never consented to it. Hence the ICO’s action.

Most of us had never heard of Clearview until January 2020, when Kashmir Hill, a fine tech journalist, revealed its existence in the New York Times. It was founded by Hoan Ton-That, a tech entrepreneur, and Richard Schwartz, who had been an aide to Rudy Giuliani when he was mayor of New York and still, er, respectable. The idea was that Ton-That would supervise the creation of a powerful facial-recognition app while Schwartz would use his bulging Rolodex to drum up business interest.

It didn’t take Schwartz long to realise that US law enforcement agencies would go for it like ravening wolves. According to Hill’s report, the Indiana state police was the company’s first customer. In February 2019 it solved a case in 20 minutes. Two men had got into a fight in a park, which ended with one shooting the other in the stomach. A bystander recorded the crime on a smartphone, so the police had a still of the gunman’s face to run through Clearview’s app. They immediately got a match. The man appeared in a video that someone had posted on social media and his name was included in a caption on the video clip. Bingo!

Clearview’s marketing pitch played to the law enforcement gallery: a two-page spread, with the left-hand page dominated by the slogan “Stop Searching. Start Solving” in what looks like 95-point Helvetica Bold. Underneath would be a list of annual subscription options – anything from $10,000 for five users to $250,000 for 500. But the killer punch was that there was always, somewhere, a trial subscription option that an individual officer could use to see if the thing worked.

The underlying strategy was shrewd. Selling to corporations qua corporations from the outside is hard. But if you can get an insider, even a relatively junior one, to try your stuff and find it useful, then you’re halfway to a sale. It’s the way that Peter Thiel got the Pentagon to buy the data-analysis software of his company Palantir. He first persuaded mid-ranking military officers to try it out, knowing that they would eventually make the pitch to their superiors from the inside. And guess what? Thiel was an early investor in Clearview.

It’s not clear how many customers the company has. Internal company documents leaked to BuzzFeed in 2020 suggested that, by then, people associated with 2,228 law enforcement agencies, companies and institutions had created accounts and collectively performed nearly 500,000 searches – all of them tracked and logged by the company. In the US, the bulk of institutional purchases came from local and state police departments. Overseas, the leaked documents suggested that Clearview had expanded to at least 26 countries outside the US, including the UK, where searches (perhaps unauthorised) by people in the Met, the National Crime Agency and police forces in Northamptonshire, North Yorkshire, Suffolk, Surrey and Hampshire were logged by Clearview servers.

Reacting to the ICO’s fine, the law firm representing Clearview said that the fine was “incorrect as a matter of law”, because the company no longer does business in the UK and is “not subject to the ICO’s jurisdiction”. We’ll see about that. But what’s not in dispute is that many of the images in the company’s database are of social media users who are very definitely in the UK and who didn’t give their consent. So two cheers for the ICO.

What I’ve been reading

A big turn off
About Those Kill-Switched Ukrainian Tractors is an acerbic blog post on Medium by Cory Doctorow on the power that John Deere has to remotely disable not only tractors stolen by Russians from Ukraine, but also those bought by American farmers.

Out of control
Permanent Pandemic is a sobering essay in Harper’s by Justin EH Smith asking whether controls legitimised by fighting Covid will ever be relaxed.

Right to bear arms?
In Heather Cox Richardson’s Substack newsletter on the “right to bear arms”, the historian reflects on how the second amendment has been bent out of shape to meet the gun lobby’s needs.
