Monday night, on the eve of its courtroom showdown with Apple over a locked iPhone, the FBI surprised everyone. In a motion filed with the court, it asked for more time to explore a possible new method it claimed to have come across for unlocking the iPhone at issue — the work phone of one of the San Bernardino shooters. If the method works, the FBI will no longer need Apple's assistance to unlock the phone.
What does this mean for the fight over whether the government can force Apple to write new software to help the FBI break into an iPhone?
First and foremost, it makes it hard to trust the technical expertise of the FBI. The FBI had previously claimed in filings with the court and in a hearing before the House Judiciary Committee that it couldn’t get into the San Bernardino iPhone on its own. It insisted that the only way to break into the phone was to force Apple to write new software weakening the security protections on the device.
We have already explained that a key premise of the government's argument — that it would lose the data if it tried to guess the passcode too many times — was false. And now the FBI is acknowledging that its previous statements that only Apple could help may also have been wrong.
This doesn't inspire confidence, and it is yet another reason to resist the government's demands in the larger debate about whether tech companies should be forced to weaken the encryption in their devices to provide for governmental access. There is an extraordinary consensus among security professionals that doing so would be disastrous for security. The FBI has responded by wishing away that consensus. The latest development in Apple's case gives little reason to think the FBI has the technical expertise necessary to second-guess the judgment of the technical community.
Second, the legal fight is far from over. Even if the FBI gets access to the San Bernardino phone using the new method it is exploring, it is inevitable that the FBI will come knocking again. We know of a dozen or so other apparently pending requests that Apple has received from the government for technical assistance — including a case in New York, in which the government recently appealed a ruling against it. (It’s unclear if the technique that the FBI indicated could help it unlock the San Bernardino phone would apply in these other instances. If it does, that could moot those requests.)
Meanwhile, Apple and other tech companies are bolstering the security of their devices in response to consumer demand and the new threats to security cropping up every day. The FBI will eventually run into a phone that it can't unlock on its own, and it will likely seize the opportunity to go after the legal precedent it sought in this case.
Finally, expect the debate in Congress to heat up. The top Republican and Democrat on the Senate Intelligence Committee, Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.), are reportedly circulating draft legislation that would allow judges to order companies like Apple to provide backdoor access to encrypted communications. Thus, the momentum in Congress continues despite delays in court. And we have yet to see the Obama administration take a firm stand to defend companies like Apple and their right to ensure the security of our electronic products.
The security of all of our devices is compromised in a world in which the FBI — or any other government around the world — can turn the tech companies against their users. Users need to be able to trust tech companies to refrain from pushing government-mandated malware onto their phones. That trust is essential to the basic hygiene of our devices and of the Internet. And that's why allowing the FBI to poison the well, so to speak, is nothing short of reckless.
While Monday night’s development puts the brakes on that recklessness, it is unlikely to solve the bigger problem. For that, we need the Department of Justice and members of Congress to abandon their attempts to undermine our security, and to instead focus on policies that encourage widespread adoption of strong encryption.