
Government Agency Known for Its Bonkers Tech Now Tasked With Solving Fake News

Photo: Getty

The government agency known for developing wild technology like a penny-sized vacuum, enigmatic sky balloons, and a shit-ton of robots is now tasked with solving perhaps the greatest threat to democracy—misinformation.


The Department of Defense’s Defense Advanced Research Projects Agency (DARPA) “is soliciting innovative research proposals in the area of semantic technologies to automatically assess falsified media,” according to an announcement posted by the agency on August 23. The program, called Semantic Forensics (SemaFor), effectively aims to build an automated system that can identify and defend against disinformation campaigns spanning text, audio, image, and video content.

According to the SemaFor announcement, the program wants to develop a bunch of algorithms to analyze these coordinated attacks. The text specifically details three types: a semantic detection algorithm that would determine whether media has been generated or manipulated, an attribution algorithm that would infer whether media came from a particular organization or person, and a characterization algorithm that would determine whether media was created or manipulated with malicious intent.
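To make that three-part structure a bit more concrete, here is a minimal, purely hypothetical Python sketch of how detection, attribution, and characterization might be chained together. The function names, placeholder scores, and pipeline shape are our own illustration of the announcement's wording, not anything DARPA has actually specified.

```python
# Purely hypothetical sketch of the three algorithm types named in the
# SemaFor announcement. The names, scores, and stub logic below are
# illustrative assumptions, not DARPA's design.
from dataclasses import dataclass


@dataclass
class MediaAsset:
    kind: str       # "text", "audio", "image", or "video"
    content: bytes  # raw media payload


def detect_manipulation(asset: MediaAsset) -> float:
    """Semantic detection: estimate how likely the media is generated or manipulated."""
    return 0.5  # placeholder; a real detector would run learned models per media type


def attribute_source(asset: MediaAsset) -> str:
    """Attribution: guess which organization or person the media came from."""
    return "unknown"  # placeholder; real attribution would weigh provenance signals


def characterize_intent(asset: MediaAsset) -> bool:
    """Characterization: judge whether the media was created or manipulated maliciously."""
    return False  # placeholder; modeling intent is the murkiest of the three tasks


def assess(asset: MediaAsset) -> dict:
    """Run all three analyses and bundle the results into one report."""
    return {
        "manipulation_score": detect_manipulation(asset),
        "attributed_source": attribute_source(asset),
        "malicious_intent": characterize_intent(asset),
    }


if __name__ == "__main__":
    print(assess(MediaAsset(kind="image", content=b"")))
```

Even in this toy form, the hard part is obvious: the characterization step has to reason about intent, which is exactly the kind of judgment call today's automated systems struggle with.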


DARPA’s call for proposals makes it clear that the agency isn’t looking for new spins on old tricks: it wants research that explores a new approach to defending against misinformation, not, as it states, “evolutionary improvements to the existing state of practice.”

And while an automated model sounds nice in theory, in practice these kinds of algorithmic systems have so far proven flawed and biased, and in more disturbing cases, outright discriminatory. Existing applications don’t inspire much faith that a near-future system would be both effective and just.

It’s not inherently bad that the government wants to funnel resources into developing a unique system to prevent the kinds of coordinated attacks that have enabled election interference, dangerous conspiracy theories, and genocide. But it’s a bit strange that the agency most famous for its mostly impractical, pipe-dream technology is the one charged with figuring out an essential, albeit complex, solution to an increasingly pervasive societal problem.

DISCUSSION

bkilburn
ArtistAtLarge

A system already exists. They’re called researchers. And some even have websites that post the truth about everyday news, like Snopes, Politifact, Rolling Stone, Mother Jones, Frontline, etc.

But anything to avoid paying people, right?