Earlier this year, Amazon’s facial recognition tech Rekognition—which can scan photos and videos and match them against databases of faces—faced a flood of criticism from Amazon employees, shareholders, and civil rights activists who argued that its use by government agencies was unethical or even dangerous. That backlash was spurred by an ACLU report showing how Amazon was working with law enforcement to market its surveillance tech—and now new documents provide more insight into how Amazon is giving this controversial tool to police.
On Friday, BuzzFeed News published documents obtained through a public records request further detailing the relationship between Amazon and the Orlando Police Department, which participated in a pilot program of Rekognition’s real-time facial recognition software. In June, the police department announced that its initial contract with Amazon had expired, but in July, the city said that it would continue testing Rekognition.
According to BuzzFeed, Amazon provided the city of Orlando with “tens of thousands of dollars of technology” for free. This supports a Washington Post report that found Oregon’s Washington County—whose police also piloted Rekognition—paid only $6 to $12 a month for the service. That’s around how much it costs for a Netflix subscription.
Additionally, the new documents reveal that Amazon required Orlando to sign a nondisclosure agreement (NDA) that would keep details about the pilot under wraps. The ACLU previously discovered that Washington County similarly signed an NDA with Amazon. Nondisclosure agreements are par for the course for tech companies, and Amazon claims that so is providing customers with free pilots of its services.
“Providing customers with an opportunity to test technology with free credits is a common practice in the industry and something we offer to many of our customers with various AWS services,” an Amazon Web Services spokesperson told Gizmodo in a statement. “Talking to organizations about products and new features under a non-disclosure agreement is also something we do frequently with many of our customers for the purposes of protecting intellectual property and competitive information. We continue to support our customers in the responsible use of the technology which includes providing publicly available best practices and documentation as well as ongoing guidance from our machine learning experts, all of which is standard for customer engagements.”
Perhaps the most interesting information in the new documents, however, is how Orlando officials struggled to make Amazon’s technology work. It seems law enforcement teams have been given this widely criticized technology with little—if any—training. Sgt. Eduardo Bernal of the Orlando Police Department told BuzzFeed that Amazon didn’t provide any in-person instruction on how to use Rekognition, including guidance on which photos could be uploaded to the database to build its list of suspected criminals.
Emails included in the documents suggest that the pilot program was far from flawless, with internal and public communications about it riddled with slip-ups and confusion from the very beginning. A February email shows a city official complaining about the late delivery of Amazon’s tech. Once it was set up, the problems apparently continued.
“The streams keep stopping….seems like this happens daily,” a March email from an Orlando official reads. “I started 4 or 5 streams the other day and as you said, now only 1 is still up. I thought you were working on a script to automatically restart them if there were issues?”
It’s unsurprising the program hasn’t worked all of its kinks out yet. We saw the troubling implications of the system’s flaws in July, when the ACLU of Northern California revealed that Rekognition incorrectly flagged 28 members of Congress as people suspected of crimes. Of the 28 misidentified members, 11 were people of color. What’s more, the error rate for non-white members of Congress was 39 percent, compared to 5 percent for Congress as a whole. The test cost just $12.33.
That’s a crucial driving factor behind demands for Amazon to cancel its facial recognition contracts with law enforcement officials—the technology is currently flawed, biased, and, in the wrong hands, a powerful tool for discrimination. Currently, Orlando says it will only test the software by matching faces against photos of volunteer police officers, but it’s easy to imagine more dire consequences if use of this technology is expanded.
“Any company in this space that willingly hands this software over to a government, be it America or another nation’s, is willfully endangering people’s lives,” Brian Brackeen, CEO of the face recognition and AI startup Kairos, wrote in an op-ed in June. “We need movement from the top of every single company in this space to put a stop to these kinds of sales.”