Update: The FBI is now explicitly denying that the method described in this post is the one they’re planning to employ — so apparently my suspicion was mistaken and they may well be employing a truly novel technique. The more general point, I think, is still valid: The relative speed with which an outside firm was able to demo a solution once the case hit the headlines should raise legitimate questions about how serious an independent effort the FBI made before claiming “necessity” and turning to compulsion to access the phone. Manifestly, someone out there has the capability, meaning this protracted and costly lawsuit could have been avoided — and the phone cracked weeks or months ago — had they only approached the right parties for assistance. Original post follows.

In a third-act twist worthy of M. Night Shyamalan, the FBI has announced that it has just discovered a method, provided by an unnamed “third party,” of breaking into deceased San Bernardino shooting suspect Syed Farook’s iPhone without help from Apple. As a result, the hearing at which Apple and DOJ lawyers were scheduled to square off today has been postponed for at least two weeks while the Bureau tests out this “new” approach, potentially rendering the legal battle with Cupertino moot.

The scare quotes in the previous sentence are there to signal my skepticism that there is a genuinely novel technique in play here. That matters because the FBI has consistently represented to the courts that Apple’s assistance, and an order to compel that assistance, was “necessary” to access the data — which is to say, that the FBI had no viable alternative method of decrypting the contents of the phone. Yet from the beginning of the public debate over this case, the technical experts I talked with consistently pointed to two distinct approaches the Bureau might employ that wouldn’t require Apple to write or authenticate a line of code.

First, there are potential methods of extracting the phone’s UID — a secret master encryption key, unique to each device, physically embedded in its processor chip. With that key, which is designed to be difficult to read and unknown even to Apple, the FBI could crack the encryption protecting the iPhone data in a matter of minutes. Though cumbersome, time-consuming, and expensive, these methods would almost certainly still be cheaper than a protracted legal battle with a deep-pocketed tech titan; they would, however, also inherently carry some risk of destroying the key material in the process, rendering the iPhone data permanently inaccessible.
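To see why possession of the UID reduces this to a quick offline job, recall that the key protecting the data is derived from the UID entangled with the user’s passcode, so an attacker who has extracted the UID can simply try all 10,000 four-digit passcodes on their own hardware, free of the phone’s guess limits and auto-erase. Below is a rough Python sketch of that brute force. It is a simplified stand-in only: Apple’s real derivation runs through a hardware AES engine and cannot be reproduced with a generic KDF, so the PBKDF2 parameters, the timing, and the example passcode are illustrative assumptions, not details from the filings.

```python
# Illustrative sketch only: Apple's real key hierarchy entangles the passcode with
# the UID inside a hardware AES engine, so the KDF, iteration count, and timings
# below are stand-in assumptions, not the actual on-device scheme.
import hashlib
import os
import time

UID = os.urandom(32)        # the per-device secret key burned into the processor
TRUE_PASSCODE = "4831"      # the unknown four-digit passcode we're pretending to crack

def derive_key(uid: bytes, passcode: str) -> bytes:
    # Stand-in for the UID-entangled derivation: each guess costs tens of
    # milliseconds, roughly mimicking the per-guess cost engineered on-device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

target_key = derive_key(UID, TRUE_PASSCODE)  # the key the data is actually encrypted under

start = time.time()
for n in range(10_000):                      # the entire four-digit passcode space
    guess = f"{n:04d}"
    if derive_key(UID, guess) == target_key:
        print(f"Recovered passcode {guess} in {time.time() - start:.0f} seconds")
        break
```

On a typical laptop this loop should finish in a matter of minutes, which is the point: once the UID is out of the chip, none of the phone’s rate limits or erase triggers apply.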

The second and more plausible method was described in some detail weeks ago by ACLU technology fellow Daniel Kahn Gillmor, and even referenced by Rep. Darrell Issa at a recent hearing with FBI Director James Comey. Read Gillmor’s post for the details, but in essence it involves removing and backing up the phone’s “effaceable storage,” the key material that is erased to render the phone’s data permanently inaccessible after too many incorrect passcode guesses, before any guessing begins. When the FBI hits the guess limit, they “re-flash” the backed-up data to the phone and get another round of guesses. Security researcher Jonathan Zdziarski argues cogently that this is the most probable option.
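To make the mechanics concrete, here is a toy Python simulation of that re-flash cycle. It is not real forensic tooling: the Phone class, the ten-guess limit, and the four-digit passcode are simplified assumptions standing in for the hardware details Gillmor describes, and the point is only that the auto-erase protects state an attacker can copy and restore.

```python
# Toy simulation of the re-flash idea, not real forensic tooling: the Phone class,
# the ten-guess limit, and the four-digit passcode are simplified assumptions
# standing in for the effaceable storage and counter Gillmor describes.
import copy

class Phone:
    GUESS_LIMIT = 10

    def __init__(self, passcode: str):
        # "Effaceable storage": the key material plus the failed-guess counter.
        self.storage = {"key_material": "SECRET-KEY-BITS", "failed_guesses": 0}
        self._passcode = passcode

    def try_passcode(self, guess: str):
        if self.storage["key_material"] is None:
            raise RuntimeError("key material erased; data permanently inaccessible")
        if guess == self._passcode:
            return self.storage["key_material"]
        self.storage["failed_guesses"] += 1
        if self.storage["failed_guesses"] >= self.GUESS_LIMIT:
            self.storage["key_material"] = None  # the auto-erase fires
        return None

phone = Phone(passcode="4831")
backup = copy.deepcopy(phone.storage)            # image the effaceable storage first

for n in range(10_000):
    if phone.storage["key_material"] is None:    # auto-erase fired on the last round
        phone.storage = copy.deepcopy(backup)    # "re-flash" the backup: counter reset, key restored
    key = phone.try_passcode(f"{n:04d}")
    if key:
        print(f"Passcode {n:04d} unlocks the key material")
        break
```

In other words, the trick does not disable the guess limit; it repeatedly rewinds the state the limit depends on, and each restore buys another round of attempts until the passcode space is exhausted.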

If that’s the case, the Bureau ought to have some explaining to do, because this alternative surely should not have been unknown to the FBI’s forensic experts. If we’re uncharitable, we might suspect the FBI of being less than forthcoming with the court about a range of feasible alternatives they should have been aware of. If we’re more charitable, then at least it seems as though they did not make a very serious effort to explore alternatives before pleading “necessity.” A high-profile terrorist attack must have seemed like an ideal test case for the proposition that technology companies can be compelled, under existing law, to hack their own security on the government’s behalf, which might have sapped enthusiasm at Main Justice for abandoning that test case in favor of an attack that would yield this phone’s data but would be unlikely to work on newer model phones. Of course, that cost-benefit calculus might look different once it became clear that this would be a long legal slog, with Silicon Valley more generally lining up to back Apple — not a quick and easy PR win for the government.

No doubt the FBI will plead reluctance to disclose too much about their “sources and methods” of accessing data on the phone, but they should at least be under some pressure to confirm, generally, whether they’re using some variant of an approach they ought to have known about well before this past weekend. If so, that ought to affect how much credibility future courts afford their representations of necessity in similar cases.

And, of course, there will be no shortage of such cases: There are a dozen underway already, and hundreds more locked iPhones in the hands of various law enforcement agencies. Since the method outlined above will (probably) not work on newer iPhones, the underlying legal questions raised by this case will still need to be resolved — though perhaps by courts that have learned to regard the FBI’s technical affidavits with a bit more skepticism.