First independent study of the London force's use of facial recognition has been released
Four out of five people identified by the Metropolitan Police's facial recognition technology as possible suspects are innocent, according to an independent report.
Researchers found that the controversial system is 81% inaccurate - meaning that, in the vast majority of cases, it flagged up faces to police when they were not on a wanted list.
The force maintains its technology only makes a mistake in one in 1,000 cases - but it uses a different measurement to arrive at this conclusion.
Citing a range of technical, operational and legal issues, the report concludes that it is "highly possible" the Met's usage of the system would be found unlawful if challenged in court.
The Met has been monitoring crowds with live facial recognition (LFR) since August 2016, when it used the technology at Notting Hill Carnival.
Since then, it has conducted 10 trials at locations including Leicester Square, Westfield Stratford, and Whitehall during the 2017 Remembrance Sunday commemorations.
The first independent evaluation of the scheme was commissioned by Scotland Yard and conducted by academics from the University of Essex.
Professor Pete Fussey and Dr Daragh Murray evaluated the technology's accuracy at six of the 10 police trials. They found that, of 42 matches, only eight could be verified as correct - an error rate of 81%. Four of the 42 could not be verified either way, because the people flagged were absorbed into the crowd before officers could reach them.
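The report's 81% figure and the Met's one-in-1,000 figure can describe the same trials because they divide by different things. A minimal sketch of the arithmetic, in which the total number of faces scanned is a purely hypothetical assumption - only the 42 alerts and eight verified matches come from the evaluation:

```python
# Sketch of why the two error figures differ (hypothetical denominator:
# only the 42 alerts and 8 verified matches come from the evaluation;
# the total number of faces scanned is an illustrative assumption).
alerts = 42             # matches flagged across the six observed trials
verified_correct = 8    # alerts confirmed as genuine matches
faces_scanned = 42_000  # HYPOTHETICAL total faces processed by the cameras

wrong_alerts = alerts - verified_correct  # 34

# Researchers' measure: wrong alerts as a share of all alerts raised
researcher_error = wrong_alerts / alerts
print(f"Error per alert: {researcher_error:.0%}")  # 81%

# A Met-style measure: wrong alerts as a share of every face scanned,
# which yields a far smaller number for the same underlying performance
met_style_error = wrong_alerts / faces_scanned
print(f"Error per face scanned: {met_style_error:.2%}")
```

Whatever the true scan total, dividing the same 34 wrong alerts by thousands of scanned faces rather than by 42 alerts is what produces a figure in the region of one in 1,000.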
The co-authors also found "significant" operational problems, with obtaining the consent of those affected proving a particular issue.
When live facial recognition is used in public places, everyone who comes within range of the cameras is considered to be under overt surveillance.
The Met did make an effort to notify passers-by about its trials by putting out signs and tweeting about each event.
But the researchers observed "significant shortcomings" in this process - and said this created difficulty in gaining meaningful consent.
Professor Fussey and Dr Murray wrote: "Treating LFR camera avoidance as suspicious behaviour undermines the premise of informed consent.
"The arrest of LFR camera-avoiding individuals for more minor offences than those used to justify the test deployments raise clear issues regarding the extension of police powers and of 'surveillance creep'."
Their report also criticised the Met's use of "watch lists" - the registers of "wanted" people that facial recognition is supposed to help locate.
According to the report, the data used to create watch lists was not kept current, so people were stopped even though their cases had already been addressed. In other instances, there were no clear reasons why people had been put on watch lists, leaving "significant ambiguity" about the intended purpose of facial recognition.