Facial recognition software: more accurate, but fair?

Joe Purshouse
Wednesday, 28 November 2018

The report from the Universities’ Police Science Institute provides a welcome evaluation of South Wales Police’s use of live facial recognition technology. Until now, the police had been using this technology at large public gatherings in the absence of robust evidence that it was of any use.

There are some positives for the police to take away from the report. It suggests that the technology is becoming more accurate as the algorithms improve and officers grow more familiar with the facial recognition system in South Wales.

However, concerns remain about the practical usefulness of the technology. The report showed that the technology does not work well in the dark, and can be evaded if dangerous individuals simply cover their faces as they pass the facial recognition cameras.

There are still pressing questions over whether the technology offers value for money. The number of arrests made with the assistance of facial recognition technology is low, while the cost of the trial in South Wales ran close to £2 million.

Moreover, the report provides no conclusive answer to questions concerning the legality of the police use of live facial recognition surveillance.

Currently the police have wide-reaching discretion to decide whom they wish to include on their ‘watchlists’ for facial recognition purposes, and can use images of those who have no previous convictions. Other research on police facial recognition suggests that this technology has major implications for people’s privacy rights, and its use can worsen existing biases and discrimination in policing practices.

Privacy is infringed because people are subjected to biometric surveillance as they go about their business in public. The captured data and images are also stored for a period of time, in circumstances that the individual cannot manage or control.

In addition, facial recognition technology can expose people to potential discrimination: research indicates that ethnic minorities are misidentified by facial recognition systems at higher rates than the rest of the population.

This inaccuracy may lead to members of certain groups being arbitrarily subjected to stops or arrests, and their data being retained inappropriately.

Joe Purshouse is a lecturer in criminal law at the University of East Anglia. His specialist interests include privacy, facial recognition surveillance, DNA, criminal records and online vigilantism.