Apple is still fighting the government over whether it must create special software to help the DOJ unlock an iPhone linked to the suspect in the San Bernardino shooting. But government officials and Apple execs agree on one key point: It’s not about one phone. This is about the future of security.
In an op-ed for the New York Times today, New York Police Department Commissioner William Bratton and NYPD Intelligence and Counterterrorism Deputy Commissioner James J. Miller acknowledged that what Apple has been asked to do will shape how the government demands access to secured devices from tech companies in the future.
“The ramifications of this fight extend beyond San Bernardino,” Bratton and Miller write. The NYPD bosses say that the government’s demand boils down to restoring “a key that was available until 2014.” That’s a reference to the change Apple made in 2014, when it strengthened its encryption with iOS 8 so that even Apple could not extract data from a locked iPhone.
While it’s easy to imagine that most government officials wish they could time travel and convince Apple not to upgrade its security, or that Apple could somehow remotely downgrade all of its phones to iOS 7 and other software versions with weaker encryption, that’s a misrepresentation of what the DOJ is asking. It is asking Apple to create software that works around the security measures protecting encrypted data. And it is asking in order to set a precedent for cooperation, not just to resolve this one rare and extreme incident.
What Apple is pushing back against is the idea that the government can ask it for something that it doesn’t have, and compel the company to create a security bypass for its software. If the DOJ can do that, Apple argues, it sets a precedent for compelling tech companies to weaken their own security.
It definitely won’t stop with one phone. The Justice Department is already seeking court orders for at least twelve other iPhones, according to a report from the Wall Street Journal.
This is a fight about how strong our devices’ security protections should be. Apple’s refusal to create security-weakening software in the San Bernardino case doesn’t mean the government has no other ways to break the security on the device. It certainly makes it more difficult, just as it’s harder for police to knock down a locked door than it is to get the lock manufacturer to develop a special key to open it. Bratton’s op-ed makes it clear that the government prefers device security that’s easier for it to crack—he’s unapologetic about wanting a “front door” entrance. (The op-ed’s title: “Seeking iPhone Data, Through the Front Door.”)
What it doesn’t do is offer any compelling reasoning why Americans should embrace shitty phone security as a trade-off for making the execution of search warrants easier. Nobody wants to see terrorists get away, but framing this as a choice between terrorists winning and normal people having secure phones is a slimy move. That’s not the choice. The government is attempting to conscript third-party security providers to destroy their own services as a shortcut to executing a search warrant. The choice is whether that conscription is fair and legal—or an undue burden that will undermine security.
“Google, which owns the Android system, now indicates that it will follow Apple’s lead,” Bratton and Miller write. “For those companies, and others like them, there is a sound argument in not wanting, even indirectly, to become an arm of the government. But when you are the two companies whose operating systems handle more than 90 percent of mobile communications worldwide, you should be accountable for more than just sales.”
Apple, Google, and other companies are certainly accountable for more than just sales. But arguing that these companies have a responsibility to perform what amounts to police work—that they should break protections they’ve created for customers at the whim of the government—is absurd.