A worker connects IBM Intelligent Cluster modules, including servers and data storage devices, at an IBM data center.
Photo: Sean Gallup (Getty)

The artificial intelligence craze isn’t just hitting Silicon Valley—the Justice Department wants to get in on the action, too.

The agency announced today that it will put $2 million towards research on AI, which it believes could be used to fight human trafficking, illegal border crossings, drug trafficking, and child pornography.


The National Institute of Justice (NIJ), the DoJ’s research wing, is funding the initiative in the hopes that it will help address the opioid crisis and fight crime by helping investigators sift through massive amounts of data.

“Crimes such as gang violence, migrant smuggling, and human and opioid trafficking generate volumes of data resulting from the use of various communications and social media technologies by gang members, traffickers, smugglers; and financial transactions related to illicit activities,” NIJ wrote in its announcement. “NIJ seeks proposals for R&D projects that bring advances in AI technologies to bear on the analysis of these data, in order to provide investigative leads to enable law enforcement agencies in the United States.”

NIJ also wants to fund research on detecting encrypted child pornography files without breaking encryption, according to its call for proposals.


“Encryption poses a major challenge to law enforcement in its efforts to combat child pornography,” the announcement states. “NIJ seeks proposals for R&D projects that examine the potential for developing technologies that can distinguish a contraband file through its encrypted container—without breaking encryption—with a sufficient degree of certainty to support probable cause for a court order to unlock the device, based on the encryption pattern of a particular file type.”

The DoJ research initiative isn’t the first time law enforcement officials have tried to leverage AI. The British government recently announced that it had developed an AI tool to detect terrorist videos, and police forces have contracted with companies to run machine learning analysis on body camera videos.

Privacy advocates have warned that AI could be abused by law enforcement agencies: it’s difficult to keep bias from creeping into algorithms, as ProPublica recently documented in software designed to predict recidivism.


[PrivacySOS]