FBI Director Christopher Wray participates in a question-and-answer session while arguing for the renewal of Section 702 of the Foreign Intelligence Surveillance Act at the Heritage Foundation, October 13, 2017 in Washington, DC. (Photo: Getty)

At no point in the past century has the FBI had greater access to consumer data than it does today. It has never been easier to locate a person of interest—be it a terrorist, a counterfeiter, or a child predator. The quickening pace of technology has given rise to new forms of surveillance, the likes of which, two decades ago, no federal agent would’ve thought possible.

Yet, for years now, the FBI has argued just the opposite, painting for the American public a picture of its agents fumbling around in the shadows without a flashlight, hindered by privacy-enhancing consumer technology that, the bureau claims, helps countless criminal suspects and terrorists evade arrest. Despite a steady year-to-year increase in search warrants, subpoenas, and National Security Letters served on technology companies, the bureau assures us that its investigations are “Going Dark.”

“To put it mildly, this is a huge, huge problem,” FBI Director Christopher Wray said Sunday during a speech at the International Association of Chiefs of Police conference in Philadelphia. “It impacts investigations across the board—narcotics, human trafficking, counterterrorism, counterintelligence, gangs, organized crime, child exploitation.”

Over an 11-month period, Wray said, federal agents have found themselves unable to access the content of more than 6,900 mobile devices. “I get it,” he said, “there’s a balance that needs to be struck between encryption and the importance of giving us the tools we need to keep the public safe.”

Today, the “Going Dark” debate is best exemplified by what readers may recall as “FBI vs. Apple.” In early 2016, the Justice Department fought to force Apple to unlock the encrypted iPhone 5C of Syed Farook, one of the perpetrators of the 2015 San Bernardino attack, in which 14 people were killed. The court battle was averted, however, after the FBI employed an unidentified vendor to unlock the phone. (A federal judge ruled earlier this month that the FBI was under no obligation to reveal the vendor’s identity, citing a need to protect its methods from disclosure.)

The FBI has long argued that it actually supports the use of “strong encryption.” That is patently false, of course. The bureau vehemently opposes the adoption of technological architectures to inhibit third-party access to communications, such as end-to-end encryption. Instead, it has argued in favor of what’s commonly called a “backdoor” into consumer devices—a mythical gateway into encrypted communications that only the federal government can access.

More than two years ago, the FBI quietly removed advice from its website that had urged consumers to use encryption. It later claimed it had no record of ever doing so.

A study of the so-called “Going Dark” problem completed by the Berkman Center for Internet & Society at Harvard University last year found the FBI’s arguments mostly specious. Many tech companies, it said, would be unlikely to allow users to truly obscure their data, since the companies themselves rely on access to it for revenue streams and product functionality (e.g., data recovery). The researchers also found that software ecosystems were entirely too fragmented for encryption to become both “widespread and comprehensive.” For that to happen, “far more coordination and standardization than currently exists would be required.”

“Networked sensors and the Internet of Things are projected to grow substantially, and this has the potential to drastically change surveillance,” the researchers said. “The still images, video, and audio captured by these devices may enable real-time intercept and recording with after-the-fact access. Thus an inability to monitor an encrypted channel could be mitigated by the ability to monitor from afar a person through a different channel.”

The study, titled “Don’t Panic,” notes that locational data from cellphones and other devices, telephone calling records, and header information in emails must all remain unencrypted in order for the systems that control these methods of communication to operate. “This information provides an enormous amount of surveillance data that was unavailable before these systems became widespread,” the study says.

The imagery that accompanies a “backdoor” seems harmless enough; but it only seems that way. There is no conceivable method for creating a secure portal to consumer data that can be accessed and controlled solely by the FBI. It is purely magical thinking. The US government is itself awful at protecting secrets; the US intelligence community in particular leaks like a sieve, its data stolen or hacked on a regular basis. Highly guarded NSA and CIA programs are routinely exposed online.

If you look beyond the FBI’s colorful metaphors, what the bureau really wants for consumers is not “strong encryption,” but weak encryption—encryption with inherent flaws that make it easy to defeat. The irony, of course, is that without strong encryption, consumer data would be even more vulnerable to the very criminals the FBI claims it is trying to stop.

To wit, encryption is a process whereby messages are encoded through the use of mathematical algorithms. Some stand up to attacks better than others. But by design, only the intended recipient of a message should be capable of deciphering it. Therefore, what the FBI really wants is for American companies to use lousy math.
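The principle above can be sketched with a toy example—a one-time pad, which XORs each byte of a message with an equally long random key. This is an illustrative sketch only, not the vetted ciphers (such as AES) that real products use; the function name and sample message here are invented for demonstration.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding byte of key."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
# A random key as long as the message, shared only with the recipient.
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)   # encrypt: unreadable without the key
recovered = xor_bytes(ciphertext, key) # decrypt: trivial for the key holder

assert recovered == message
```

Because XOR is its own inverse, anyone holding the key recovers the message exactly, while the ciphertext alone reveals nothing. A “backdoor” would amount to a second copy of every key held by a third party—one more place for it to be stolen.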

There exists a huge gap between what the bureau wants and what is practicable. Tech companies are simply never going to wittingly deploy a system that leaves their customers—and thereby their own brands—open to attack.