
Clearview AI Reportedly Worked On a Mug Shot Repository to Go With Its Face Recognition App

Photo: Dhruv Mehrotra (Gizmodo)

The controversial facial recognition firm Clearview AI has already landed in hot water for purportedly letting both law enforcement and rich investors play around in its database of billions of photos scraped from the public internet. However, a new report suggests the company didn’t stop there: at one point, Clearview AI apparently aimed to compile a nationwide repository of mug shots from the last 15 years.


According to an investigation by OneZero, the company mentioned its ongoing work on the database when responding to an inquiry from Wisconsin law enforcement officials last August. In an email obtained by the outlet, a Clearview AI representative, whose name was redacted, told the Green Bay Police Department the following:

“Regarding the mugshot database: there is no way to do that currently, but we have a ‘Gallery’ feature in development that will make that possible. We are also working to acquire all U.S. mugshots nationally from the last 15 years, so once we have that integrated in a few months’ time it might just be superfluous anyway.”

Screenshot: OneZero

Creepier still, it remains unclear what became of this project. While scrubbing through the most recent Android version of Clearview AI’s app last week, Gizmodo reporters discovered bits of code hinting at several features that appeared to be under construction, such as a “private search mode” and voice search options, though nothing that explicitly pointed to a developing mug shot database. However, the app requires a user account to access Clearview’s full repertoire, aka the purportedly extensive face recognition system and toolkit available to its clients, so such a repository could theoretically sit behind that paywall. Clearview AI did not immediately respond to a request for comment from Gizmodo. OneZero was reportedly met with similar radio silence.
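For readers curious how this kind of digging usually works: a common approach is to decompile the APK (with a tool like apktool) and search the extracted files for telltale strings. The sketch below is a generic illustration of that technique, not Gizmodo’s actual workflow; the directory name and keywords are hypothetical.

```python
# Minimal sketch: search a decompiled APK's files for strings that hint
# at unreleased features. Paths and keywords are illustrative guesses,
# not confirmed identifiers from Clearview's app.
import os

DECOMPILED_DIR = "clearview_apk_decompiled"  # hypothetical output of `apktool d app.apk`
KEYWORDS = ["private_search", "voice_search", "gallery", "mugshot"]  # assumed strings to look for


def find_feature_hints(root, keywords):
    """Walk the decompiled tree and report which files mention any keyword."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            # Only bother with text-like resources and decompiled code.
            if not name.endswith((".xml", ".smali", ".json", ".txt")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    text = f.read().lower()
            except OSError:
                continue
            for kw in keywords:
                if kw in text:
                    hits.append((kw, path))
    return hits


if __name__ == "__main__":
    for keyword, path in find_feature_hints(DECOMPILED_DIR, KEYWORDS):
        print(f"{keyword}: {path}")
```

Strings surfaced this way only show that a feature name exists somewhere in the app’s code or resources; they don’t prove the feature works or will ever ship.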

Clearview AI has garnered quite a bit of notoriety over its dubious data collection methods, as well as a recent data breach that drew significant criticism from lawmakers. As first reported by the New York Times, the facial recognition firm has purportedly scraped an estimated 3 billion photographs from sites like Facebook, Google, and YouTube.

According to emails obtained by OneZero, some of those images also came from various online mug shot repositories like Rapsheets.org and Arrests.org. Given its mission “to help law enforcement agencies solve the toughest cases,” per Clearview’s website, creating a proprietary mug shot database would fall squarely within its wheelhouse and would no doubt act as an important selling point for its law enforcement clientele.


In that same email exchange, the Clearview AI employee also claimed the company was developing a method for customers to upload their own images to its app. Such a feature would effectively combine Clearview’s database with those of the local police departments it services, which carries staggering implications given the company serves roughly 2,200 organizations worldwide per a recent BuzzFeed News report. Whether this feature came to fruition also remains to be seen; an office manager with the Green Bay Police Department, Lisa Wachowski, told OneZero that the department never ended up uploading its mug shots to the app.

[OneZero]




DISCUSSION

rvincent1960
Times up, time to leave!

“the company was developing a method for customers to upload their own images to its app”

This clusterfuck just gets worse and worse. How on earth do they validate their data? So far it’s built on images and data scraped from public social network sites, and now they’re looking for users to upload data themselves? WTF??? The social site stuff is likely full of errors and deliberate misinformation, combined with old and/or doctored (enhanced or beautified) photos. And with user uploads, they have to trust that their “client” data is valid and isn’t being maliciously introduced into their system.

There’s a programmers’ acronym that goes back to the beginning of the computer age: GIGO, Garbage In, Garbage Out. This bag of shit is going to be about as reliable as most of the trash on the internet.