Amazon’s controversial face recognition software, called Rekognition, misidentified more than two dozen members of Congress as people arrested for crimes. The false identifications were made when the ACLU of Northern California tasked Rekognition with matching photos of all 535 members of Congress against 25,000 publicly available mugshot photos. The test cost the ACLU just $12.33 to perform.
In total, Rekognition misidentified 28 members of Congress, listing them as a “match” for someone in the mugshot photos. That works out to an error rate of roughly five percent across all 535 members, but the false matches skewed heavily toward people of color: 11 of the 28, or about 39 percent, were members of color, even though people of color make up only around 20 percent of Congress. Tests have repeatedly shown that face recognition is less accurate on darker-skinned people and on women.
Among those misidentified were six members of the Congressional Black Caucus, who wrote an open letter to Amazon CEO Jeff Bezos in June after the ACLU released internal documents concerning the use of Rekognition by police.
Amazon has encouraged law enforcement agencies to adopt Rekognition, with police in Orlando recently opting to continue their pilot of the tech. In a blog post, the ACLU of Northern California explained how misidentification in the hands of law enforcement could be deadly, particularly for people of color:
If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a ‘match’ indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins.
In a statement, Amazon suggested the ACLU’s results with the software “could probably be improved by following best practices,” saying the company recommends law enforcement agencies set a confidence threshold of at least 95 percent when deciding if there’s a match. Of course, there’s no legal requirement for police to do so. In the UK, police in South Wales recently defended using face recognition software with a staggering 90 percent false positive rate.
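For readers curious what that threshold actually looks like, it is simply a parameter on Rekognition’s face comparison API call. Below is a minimal sketch using the AWS boto3 SDK; the image filenames are placeholders, and the numbers mirror the figures in Amazon’s statement — an 80 percent default and a recommended 95 percent floor for law enforcement — rather than anything specific to the ACLU’s test setup.

```python
import boto3

# A minimal sketch of setting the confidence threshold when comparing two faces
# with Amazon Rekognition via the boto3 SDK. The image filenames are placeholders.
rekognition = boto3.client("rekognition")

with open("probe_photo.jpg", "rb") as probe, open("mugshot.jpg", "rb") as candidate:
    probe_bytes = probe.read()
    candidate_bytes = candidate.read()

# SimilarityThreshold defaults to 80 if omitted, the setting Amazon's statement
# points to; Amazon's guidance for law enforcement use is 95 or higher.
response = rekognition.compare_faces(
    SourceImage={"Bytes": probe_bytes},
    TargetImage={"Bytes": candidate_bytes},
    SimilarityThreshold=95,
)

# Only face pairs scoring at or above the threshold are returned as matches.
for match in response["FaceMatches"]:
    print(f"Match reported with {match['Similarity']:.1f}% similarity")
```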
Amazon’s statement is included in full below:
We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft). We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement.
With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test. While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a higher threshold of at least 95% or higher.
Finally, it is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgement (and not to make fully autonomous decisions), where it can help find lost children, restrict human trafficking, or prevent crimes.