
A face recognition app used by thousands of law enforcement agencies, which has drawn considerable scrutiny in recent weeks over its creator’s dubious data collection methods, contains code hinting at an unreported range of potential features, based on a version of the app discovered by Gizmodo.
Reporters were able to download the most recent Android version of the app marketed to police by Clearview AI, the New York-based startup whose controversial scraping of an estimated three billion photographs from the likes of Facebook, Google, and YouTube has prompted legal threats from major tech companies and alarmed privacy hawks on Capitol Hill.
The app, which will not access Clearview’s face recognition system without a login, was found on a publicly accessible Amazon server. Information stored in S3 buckets, such as the one containing Clearview’s app, is set to private by default. The version Gizmodo obtained does not come in the user-friendly form one might find in the Google Play Store. Instead, it is an APK, the file type native to Android apps, which reporters were able to download and install directly onto an Android device.
While not all of the app’s activity can be observed without a user account, reporters inspected data being sent to Google Analytics, Crashlytics, and App-Measurement, three services that record basic details about any mobile device running the app and tell Clearview whether the app is running smoothly. The app also requests access to Android’s Fine Location API, which determines the most precise location possible from available location providers, including the Global Positioning System (GPS) as well as Wi-Fi and mobile network data.

Other bits of code appear to hint at features under development, including references to a voice search option; an in-app feature that would allow police to take photos of people to run through Clearview’s database; and a “private search mode,” about which no further details are available without deeper access.
When reporters attempted to take screenshots of the app, they received an alert notifying them: “Screenshots must not be shared. Please share links of the search results instead. Any leaked screenshots will result in suspension of your account.”
According to one file, the app appears to include a feature that allows a user to search through Clearview’s proprietary database by simply tapping on an uploaded photo. The app also contains language encouraging users to send Clearview “success stories” regarding the app’s performance. It further includes the prompt: “Invite your coworkers or other investigators to Clearview for free. Just press share below to send a link with free Clearview demo account.” Without login access, it is impossible to know if or how these apparent features function.

Other code within the app identifies the unnamed augmented-reality glasses company that Clearview could potentially partner with, a detail first unearthed by New York Times reporters while examining an earlier version of the app. The app includes instructions for installing a “companion app” designed by Vuzix, an AR and computer vision company that manufactures smart glasses. (In a press release this month, Vuzix said its integration with another company, TensorMark, will allow customers “to identify countless facial and object images” stored in cloud databases.)
Clearview CEO Hoan Ton-That said in an email to Gizmodo that the companion app is a prototype and “is not an active product.” RealWear, another company, which makes “a powerful, fully-rugged, voice operated Android computer” that is “worn on the head,” is also mentioned in the app, though it’s not immediately clear what for.
The app also contains a script created by Google for scanning barcodes on driver’s licenses. (The file is named “Barcode$DriverLicense.smali.”) Asked about the feature, Ton-That responded: “It doesn’t scan drivers licenses.” Gizmodo also inquired about the app’s so-called “private search mode” but did not get a response.
Ton-That emphasized that the app cannot be used without a Clearview account. “A user can download the app, but not perform any searches without proper authorization and credentials,” he said.
Though it sat unsecured in an Amazon S3 bucket, Clearview’s app has no public version: it is available neither on the Google Play Store nor Apple’s App Store, nor on Clearview’s website without a login.
“Clearview’s app is NOT available to the public,” Clearview says on its website. “While many people have advised us that a public version would be more profitable, we have rejected the idea. Clearview exists to help law enforcement agencies solve the toughest cases, and our technology comes with strict guidelines and safeguards to ensure investigators use it for its intended purpose only.”
On Wednesday, the Daily Beast revealed a breach of Clearview’s security said to include the names of its private and public clients and the number of times they searched its database. The following day, BuzzFeed News obtained internal documents that include a long list of clients, among them the FBI, Customs and Border Protection, and Interpol, in addition to hundreds of local police departments. (The New York Times previously reported that the FBI and Department of Homeland Security were testing the product.)
In addition to more than 2,200 law enforcement agencies, BuzzFeed said, Clearview’s software had been sold to companies in 27 countries, including major U.S. retailers such as Macy’s, Walmart, and Best Buy.

Clearview responded to the breach with a statement attributed to its lawyer, saying security is the company’s “top priority” and adding, “Unfortunately, breaches are a part of life in the 21st century.”
Democratic Senators Ed Markey and Ron Wyden fired back at the response, with Markey calling the statement “laughable.” Wyden said by email that “shrugging and saying data breaches happen is cold comfort for Americans who could have their information spilled out to hackers without their consent or knowledge.” Wyden’s staff previously reached out to Clearview to request a demonstration. At first, the company said yes. But as of Thursday, it had rescheduled multiple times.
FCC Commissioner Geoffrey Starks said the breach raised doubts as to whether Clearview could be trusted with such a massive amount of personal data. Regardless, facial recognition, he said, raises “serious issues of privacy and civil liberties, particularly when it comes to communities of color.” “How can we trust a company with massive privacy responsibilities when it can’t even protect its own corporate data?” he asked.
A few police officials told the Times that Clearview’s product appeared far superior to its competitors, with one asserting its algorithm accepts “photos that aren’t perfect.” The same officer told the paper he’d run photos from old cold cases through the app and identified more than 30 suspects. But face recognition’s dependability has long been called into question by academics and publicly funded research into the technology’s limitations.
A study of 189 facial recognition systems conducted last year by a branch of the U.S. Commerce Department, for instance, found that people of African and Asian descent were misidentified at rates up to 100 times higher than white people. Women and older people were at greater risk of being misidentified, tests showed.
While championing a moratorium on police face tech, the American Civil Liberties Union last year drew attention to the case of Willie Lynch, a Black man in Florida who was arrested and charged with selling drugs on the recommendation of a face recognition algorithm. Lynch, one of several possible matches, was prohibited from challenging the algorithm in court, even though the program police relied on to obtain his identity expressed low confidence when it paired Lynch’s photo with the suspect’s.
The system, which the ACLU said was being used 8,000 times per day, uses stars to rate the quality of the match. Lynch received one star.
“Countless studies indicate that facial recognition is unreliable technology, that it doesn’t accurately identify people with darker skin complexions—especially women—and so we know that this technology will impact Black and brown communities in particularly dangerous ways,” Myaisha Hayes, national organizer on criminal justice and tech at MediaJustice, said at the time.
Compounding matters, there is little oversight when it comes to holding the country’s 17,000 police departments accountable for any misuse of confidential databases constantly sucking up facts—not to mention misinformation—about people’s private lives. Malfeasance is not uncommon. In 2016, the Associated Press unearthed reports of database misuse across the country, with officers regularly accessing confidential law enforcement databases to get information on “romantic partners, business associates, neighbors, journalists and others for reasons that have nothing to do with daily police work.”
Between 2013 and 2015, officers who misused law enforcement databases were fired or suspended, or resigned, more than 325 times, according to the AP. On more than 250 other occasions, officers were reprimanded, received counseling, or faced lesser discipline, it found.
Facebook, Google, and Twitter each served Clearview with a cease-and-desist letter this month, asking the company to halt the scraping of their users’ personal data, which Clearview’s Ton-That has defended by comparing his company to Google. “You have to remember that this is only used for investigations after the fact. This is not a 24/7 surveillance system,” said Ton-That, who argued his company had a First Amendment right to collect data Americans make public on social media and sell access to it for law enforcement purposes. “The way we have built our system is to only take publicly available information and index it that way,” he said.
Alex Joseph, a YouTube manager, fired back: “Most websites want to be included in Google Search, and we give webmasters control over what information from their site is included in our search results, including the option to opt-out entirely. Clearview secretly collected image data of individuals without their consent, and in violation of rules explicitly forbidding them from doing so.”
Clarification: A previous version of the article implied the New York Times learned about one of the app’s possible features—integration with augmented-reality glasses—from Clearview. Times reporters actually unearthed that detail while examining an earlier version of Clearview’s app. The article was updated to credit the Times.