Law enforcement officials are still banned from using Amazon’s facial recognition technology “until further notice,” the company said on Wednesday, a decision that effectively extends a yearlong moratorium that had been set to expire on June 1.
Known as Rekognition, the program in question has been widely criticized over the years for its dubious efficacy — it once incorrectly identified 28 members of Congress as criminals — and its tendency to, surprise surprise, disproportionately misidentify women and people of color. In 2019, two independent studies concluded that Rekognition’s facial recognition software did, in fact, return inaccurate or biased results, and some police precincts even objected to using the software on the grounds that it gave off “a Big Brother vibe.”
But despite those concerns, Rekognition was still in use by law enforcement until June 2020, when Amazon finally bowed to growing calls from its own employees, who worried that the program’s documented biases could lead to increased police violence against marginalized groups. In heeding those calls, Amazon announced the yearlong moratorium on use of the tech by police clients, which prompted rival Microsoft to similarly announce that it would put a hold on police use of its own facial recognition tech until there were federal regulations governing it.
In a statement issued at the time of the moratorium, Amazon said that it had “advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology.”
“We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested,” the statement continued.
But while a number of states and localities have taken it upon themselves to ban or restrict the use of facial recognition tech by police in recent years — San Francisco became the first in 2019, followed soon after by Oakland and by the states of Oregon, Maine and Massachusetts — no federal legislation regulating its use currently exists.