Police Face Recognition Mistakes Over 2,000 People For Potential Criminals
If you thought police surveillance using face scanning glasses was a nightmare that could only happen in China, better start reaching for your security blanket. Last year in South Wales, the police tested their fancy new face recognition technology on spectators at the 2017 Champions League match between Real Madrid and Juventus. But unlike the Chinese government, which may be singing the praises of its own system's success, the Welsh police have been a bit more forthcoming about the over 2,000 false positives their system produced.
Around 170,000 people attended that game, and the Automated Facial Recognition (AFR) system flagged 2,470 of them as potential criminals. Out of that large pool, however, 2,297, or roughly 93%, turned out to be false positives. While no face recognition system is perfect, that's still a pretty bad number.
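For the curious, that error rate is just the share of alerts that didn't pan out. Here's a minimal sketch of the arithmetic using the figures reported in the article (the variable names are illustrative, not from the police report):

```python
# Figures reported for the 2017 Champions League final in Cardiff
attendees = 170_000      # approximate crowd scanned
alerts = 2_470           # faces the AFR system flagged as potential criminals
false_positives = 2_297  # flagged faces that turned out to be wrong matches

# Share of alerts that were wrong (the ~93% figure above)
false_alert_rate = false_positives / alerts
print(f"False alerts: {false_alert_rate:.1%}")   # -> 93.0%

# Genuine matches are simply the remainder
true_matches = alerts - false_positives
print(f"Genuine matches: {true_matches}")        # -> 173
```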
The Welsh police were quick to defend the success of their AFR system, claiming that in the nine months it has been in use, it has led to 450 arrests and successful convictions. Not once, they say, has a wrongful arrest been made based on a false positive, and the force has not received a single complaint from the public. Part of that is thanks to the police's use of common sense: officers investigate an identified potential criminal before making an arrest.
Unsurprisingly, not everyone is convinced that the ends, or the successful results, justify the potential privacy violations of scanning people, often without their knowledge, much less their consent. It is pretty much the realization of a dystopian police state. Fortunately or unfortunately, the South Wales police didn't make use of advanced surveillance spectacles like those in China and ended up with poor quality images, which, they explained, is part of the reason for the AFR's errors.
VIA: The Guardian