Researchers at the Georgetown Center on Privacy & Technology have filed suit against the NYPD for more details on the department’s highly secretive face recognition program.
In 2016, the researchers sent public records requests to the NYPD as they prepared to release The Perpetual Lineup, a landmark report by the university on law enforcement and face recognition technologies. But since then, the NYPD has alternately claimed either that it couldn’t find any relevant records or that the records it did find were too sensitive to be released.
“The technology is not just a counterterrorism measure,” Clare Garvie, one of the Georgetown researchers suing for more information, told Gizmodo. Garvie co-authored “The Perpetual Lineup” and the 2017 follow-up focusing on face recognition in airports.
“The NYPD has indicated that they use it in routine investigations, including for misdemeanors,” she says. “It’s something the public deserves to know about, particularly the controls that are in place against misuse [and] how they’re controlled to avoid invasions of privacy and impacts on civil rights.”
The nearly 500 pages of documents the NYPD handed over to researchers were blanketed in redactions, revealing almost nothing about the program. Sentences explaining the scope of the face recognition system, its intended uses, and so on are simply redacted into a blank void, leaving the Georgetown researchers with little more than what they started with. Most crucially, Garvie wants to know what checks are in place to protect people.
“They’ve had at least five misidentifications,” Garvie said. “And we wanted more information about that. What happened with these individuals? How far do the investigations go? And what checks are in place to avoid misidentifications to avoid charging, arresting, prosecuting an innocent individual?”
Here’s what the researchers have been able to piece together about the NYPD’s “Forensic Imaging System.” It’s apparently a biometric database made up of mugshots from arrests along with iris and thumbprint data. Based on the few records the NYPD has released, it’s believed to connect biometric data with arrest data: when someone is arrested, their face, eyes, and thumbprints would be scanned and added to a searchable database alongside information such as prior arrests and their rap sheet.
The documents reference training manuals and audits, but the NYPD has so far refused to release them to Georgetown. Interestingly, the NYPD has also claimed it had no information on training and audit programs that agency representatives made public references to in conversations about crime prevention and counterterrorism.
“It may be a case where the right hand is not necessarily talking to the left,” Garvie says, “[or] they don’t have a coordinated strategy around what gets disclosed and what doesn’t.”
Georgetown researchers are due back in court in both April and May to petition a judge to force the NYPD to hand over legible versions of the documents. While Garvie hopes this will answer necessary questions about the scale of the NYPD’s program, questions remain about the underlying algorithm, including how it’s been designed and whether there are racial disparities in accuracy, a key concern raised in The Perpetual Lineup report.
“It really comes down to how robust the algorithm is and whether [it’s] been created in such a way to mitigate these accuracy concerns,” said Garvie. “Unfortunately, because these algorithms are all proprietary, we really have no transparency into what the companies themselves are doing to mitigate these risks.”