Police Wrongly Arrested a Black Man Using Racist Facial Recognition Technology


In a spectacularly rare admission of likely very common fuckery, police copped to using face recognition to make a wrongful arrest, according to the ACLU, confirming a long-suspected but nearly-impossible-to-prove practice. On Wednesday, the civil rights litigation group lodged a complaint against the Detroit Police Department over the arrest of Robert Williams in early January, after which police admitted, only once they had held him in a crowded cell overnight, that “the computer must have gotten it wrong.” Williams is Black, and facial recognition software notoriously misidentifies people of color.


In a New York Times story today, reporter Kashmir Hill relates how police arrested Williams, on his lawn and in front of his family, for allegedly stealing five watches valued at $3,800 from a store in October 2018. According to the complaint, the Detroit police had used “blurry” security footage of the crime and matched it to Williams’s driver’s license photo, which they then showed to a security guard who didn’t witness the incident.

The police didn’t give him any reason for the arrest, and told Williams’s wife to “Google it” when she asked where they were taking him. After a night in jail, police interrogated Williams, showed him a surveillance image from the scene, and asked if it was him. It was not, he said, at which point a detective reportedly said, “I guess the computer got it wrong.” According to the Times, Williams was held for several hours after the interrogation.


A Detroit police spokesperson told the Times that the department “does not make arrests based solely on facial recognition,” and that the investigation included witness interviews, a photo lineup, and that “the investigator reviewed video.” That doesn’t square with a subsequent response from the Wayne County prosecutor’s office, which confirmed that the Detroit Police Department used facial recognition to identify Williams from the security footage, and that an eyewitness to the crime was not shown the photo lineup (which, again, was built on the facial recognition match).

The office adds that Williams can have the case expunged and his fingerprints removed from the record. In the statement, Prosecutor Kym L. Worthy said that in 2019, the DPD had asked her to adopt its facial recognition “policy,” and she declined because of the faulty technology and its built-in racial bias. “This case should not have been issued based on the DPD investigation, and for that we apologize,” Worthy wrote. “This does not in any way make up for the hours that Mr. Williams spent in jail.”

Reuters reported that it reviewed government documents showing that the police made the match using the Michigan state police’s digital analysis identification section, which reportedly uses facial recognition technology from Rank One Computing. The section’s website notes that it pulls from the Statewide Network of Agency Photos, which specifically states on its own site that facial recognition “should never be considered a form of positive identification.”

As the ACLU points out, this instance is exceptional in part because police never admit to using facial recognition. “Had Robert not heard a glib comment from the officer who was interrogating him, he likely never would have known that his ordeal stemmed from a false face recognition match,” two ACLU attorneys familiar with Williams’s case wrote. “In fact, people are almost never told when face recognition has identified them as a suspect. The FBI reportedly used this technology hundreds of thousands of times — yet couldn’t even clearly answer whether it notified people arrested as a result of the technology.” The police obstructed numerous attempts by Williams’s lawyer to obtain documentation on the arrest.


We’ve known that police have used facial recognition to make arrests in the past; in 2016, the ACLU uncovered an internal marketing document from the social media location tracking company Geofeedia, which bragged that its technology enabled police to use facial recognition to identify Freddie Gray “rioters” with outstanding warrants and arrest them from the crowds. The implications are boundless; just after protests began over George Floyd’s death last month, the ACLU filed a lawsuit against the facial recognition company Clearview AI, which provides law enforcement agencies and private companies the ability to match (or “match”) faceprints scraped from social media and even Venmo accounts.

Rank One told Gizmodo that the use of face recognition in Williams’s case “goes against established best practices for forensic face recognition” and that the company “unreservedly opposes” using facial recognition as the sole basis for an arrest. “Moving forward, we will add legal means to prevent uses of our software which violate our Code of Ethics,” the company said, adding that it will conduct a technological review of software safeguards to prevent future misuse.


The ACLU demands in its complaint that the Detroit Police Department stop using facial recognition in its investigations, “as the facts of Mr. Williams’ case prove both that the technology is flawed and that DPD investigators are not competent in making use of such technology.”

The Detroit Police Department and the ACLU were unavailable for comment.

Note: This post has been updated to include comment from Rank One.


Staff reporter, Gizmodo. wkimball @ gizmodo


DISCUSSION

So, I’m still confused. The cops review the matches after the system pops them out. It isn’t like they send out an automated drone to pick people up after the system finds a match.

Are we upset that the tech found the person as a possible match, or that the cop didn’t look carefully enough?

To me, making it easier for an investigator to find the person who broke into my house/car/work is good, as long as there is still human intervention.