
A friendly critique of the proposed Chesney/Vladeck “middle ground” in the Apple/FBI disputes

Bobby Chesney and Steve Vladeck have published an excellent and provocative post today concerning the recent All Writs Act disputes between the government and Apple. I agree with much of what they write. Most importantly, I agree with Bobby and Steve that the cases should, and in all likelihood will, turn on a careful, case-by-case assessment of “the burden the government’s requested relief would impose upon the third-party recipient” — a calculus that might vary considerably among different sorts of phones and operating systems.

I wonder, however, whether the “stable line” they propose is the right place for the courts to draw the relevant distinctions. As Bobby and Steve would have it, the AWA should be construed to authorize courts to compel companies “to utilize existing capabilities or software vulnerabilities” to assist the court and the government in effecting valid orders; but the AWA would not authorize “orders to companies to design and create materially new [capabilities or vulnerabilities].”

Requiring the application of existing capabilities: permissible.

Requiring the creation of new capabilities: outside the scope of the AWA.

This sort of an existing/new distinction would, perhaps, create “a stable line for forward-looking application of the All Writs Act in similar cases,” and it has a great deal of surface appeal. Indeed, it’s a central theme of Apple’s briefs in the California case, and of many of the briefs of its amici. The ACLU’s amicus brief, for example, begins with a similar distinction that seems to be resonating with a lot of observers: “The order the government seeks would be unprecedented. The government is not seeking evidence in Apple’s possession or control, as would be consistent with precedent under the All Writs Act. Rather, it seeks to compel Apple to create and authenticate software that would allow the government to break into an individual’s iPhone . . . .”

Perhaps Apple is even correct when it argues, in its reply brief, that an order to create new software is outside the scope of the AWA because that statute does not authorize courts to make use of “entirely new writs with no common law analog.” (I haven’t done the research to determine whether, as Apple suggests, the AWA does not extend to writs that lack a common-law analog and, if so, whether an order to “create a new tool” has a common-law analog. The parties’ briefs are not very helpful on those questions.)

But here’s where things get tricky: The government’s request that Apple create new software, and then effectively install it on Farook’s iPhone by sending it as an automatic update, so as to disable the “auto-lock” feature on the phone (which would then presumably enable the FBI itself to unlock the phone by a “brute force” search for the password), is in effect only a fallback request, made for Apple’s benefit.

As far as the government is concerned, Apple does not have to “create” anything “new” at all, if it does not wish to do so. The government would be happy to do the creating itself, using its own creative and industrious cryptographers. Yet the government is not doing the new work itself; instead, it requests that Apple be ordered to create and implement the anti-auto-lock software. Why?

The answer is rather simple: It’s not a matter of relative expertise or creativity. Instead, it’s that Apple’s devices will not run the anti-auto-lock software unless it is signed with Apple’s private “key.” “[T]he FBI cannot itself modify the software on Farook’s iPhone,” explains the government, “without access to the source code and Apple’s private electronic signature.” Apple concurs: As one of its declarants has stated, “[w]e agree with the government that the system requires Apple authentication.”

So why doesn’t the government simply ask Apple to hand over the private key—which is, after all, something already created and in Apple’s possession or control—and then do the work itself to create the necessary software? That demand for a “key” would be akin to a traditional and ubiquitous sort of order under the AWA; in the words of the ACLU, it would “be consistent with precedent under the All Writs Act.” And, importantly, it would appear to satisfy Bobby and Steve’s test: Apple would merely be providing the government with an “existing” capability!

As the government notes, however, it has not sought to compel Apple to turn over the existing source code “because it believed such a request would be less palatable to Apple.” And, of course, that’s absolutely right, because the “key” in question is what Julian Sanchez calls “the Holy Grail of Apple’s security infrastructure,” something that Apple “almost certainly [has] stored in secure vaults, on a Hardware Security Module that makes it difficult or impossible to copy the key itself off that dedicated hardware, and likely protected by elaborate procedures that have to be followed to authenticate major new software releases.”

No doubt Apple would prefer to create the new software, rather than to turn over what even the government calls the “keys to the kingdom.” Which is entirely reasonable, because (as Apple puts it) turning over the key could have “catastrophic security implications.” If that’s Apple’s preference, however, there’s something a bit disingenuous, or at least disconcerting, about Apple and its amici placing so much reliance on the idea that an order requiring the application of labor to create something new is “unprecedented” and inconsistent with the All Writs Act.

For this reason, I doubt that Bobby and Steve’s existing/new distinction can do the work they hope it will.

Nevertheless, there is something in their post that does, I think, point the way to the important—but thus far unanswered—factual questions that should be central to the critical “burden” calculus in these cases.

Bobby and Steve write that “everyone seems to agree” that the tool the government is seeking to compel Apple to create in the San Bernardino case “would amount to a significant new vulnerability in iOS 9.x” — for many people’s iPhones, not only Farook’s — “were it to escape into the wild.” They also write that Apple would therefore be required to expend “considerable effort . . . for an indeterminate (but no doubt substantial) length of time either to protect the tool (if kept) or to destroy and recreate it again and again (as a wave of similar applications inevitably would follow).”

Bobby and Steve are absolutely right that this is the nub of the case. Alas, contrary to their suggestion, there is no consensus on it. The parties strongly dispute each of the two propositions.

The government, for its part, argues both that Apple could readily protect the new code and that the sky would not fall if somehow that code fell into the wrong hands. As to the first point, DOJ writes:

[C]ontrary to Apple’s stated fears, there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession. Nothing in the Order requires Apple to provide that code to the government or to explain to the government how it works. And Apple has shown it is amply capable of protecting code that could compromise its security. For example, Apple currently protects (1) the source code to iOS and other core Apple software and (2) Apple’s electronic signature, which as described above allows software to be run on Apple hardware. (Hanna Decl. Ex. DD at 62-64 (code and signature are “the most confidential trade secrets [Apple] has”).) Those—which the government has not requested—are the keys to the kingdom. If Apple can guard them, it can guard this.

And as to the “speculative harm” point, DOJ writes:

Even if “criminals, terrorists, and hackers” somehow infiltrated Apple and stole the software necessary to unlock Farook’s iPhone (Opp. 25), the only thing that software could be used to do is unlock Farook’s iPhone. (Perino Decl. ¶¶ 6.a, 18-24.) Far from being a master key, the software simply disarms a booby trap affixed to one door: Farook’s. The software “will be coded by Apple with a unique identifier of the phone so that the [software] would only load and execute on the SUBJECT DEVICE [i.e., Farook’s iPhone].” (Order ¶ 3.) This phone-specific limitation was not dreamed up by the government, but instead employs Apple’s well-publicized security paradigm. A “unique ID (ECID)” associated with each physical iPhone is incorporated into the phone’s operating system. (Perino Decl. ¶ 20; Hanna Decl. Ex. K at 6.) “Adding the ECID ‘personalizes’ the authorization for the requesting device.” (Id.) Apple has designed its phones so that every operating system must pair with the phone’s ECID. (Perino Decl. ¶¶ 18-24; Hanna Decl. Ex. K at 6 (describing how the Apple server “adds the ECID” before it “signs” the iOS to be used for the upgrade).) The operating system and ECID must correspond for the operating system to work. The ordered software would rely upon the same limitation. Apple implies that the code could be modified to run on other phones, but a second Apple security layer prevents that from happening: Apple devices will only run software that is electronically “signed” by Apple. (Hanna Decl. Ex. K at 6 (“only Apple-signed code can be installed on a device”).) “Signing” the software described in the Order will not release Apple’s signature to the government or anyone else—Apple signs all publicly available iOS software, but that does not disclose the signature itself. (Perino Decl. ¶¶ 9, 13-17, 24, 28.) And if the code were modified to run on a phone with a different ECID, it would lack a valid digital signature. Without that signature, the code would not run at all on any iOS phone with intact security. (Id.) Thus, it is simply not plausible that Apple’s complying with the Order would cripple iPhone security.
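For readers trying to follow the brief’s two-layer argument, the mechanism it describes (ECID personalization plus Apple’s signature requirement) can be illustrated with a toy sketch. This is purely a hypothetical model, not Apple’s actual implementation: real iOS updates use asymmetric, public-key signatures, whereas the HMAC secret here merely stands in for “Apple’s private electronic signature,” and all names and values are invented.

```python
import hmac
import hashlib

# Stand-in for Apple's closely guarded signing key (hypothetical).
APPLE_SECRET_KEY = b"stand-in-for-apples-signing-key"

def sign_personalized_image(os_image: bytes, ecid: str) -> dict:
    """Model of the server step the brief describes: Apple 'adds the
    ECID' to the image and then signs the combined blob."""
    blob = os_image + ecid.encode()
    signature = hmac.new(APPLE_SECRET_KEY, blob, hashlib.sha256).hexdigest()
    return {"image": os_image, "ecid": ecid, "signature": signature}

def device_will_run(update: dict, device_ecid: str) -> bool:
    """A device boots an image only if (1) the signature is a valid
    Apple signature over image + ECID, and (2) the embedded ECID
    matches this particular device."""
    blob = update["image"] + update["ecid"].encode()
    expected = hmac.new(APPLE_SECRET_KEY, blob, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, update["signature"]):
        return False  # not Apple-signed, or tampered with
    return update["ecid"] == device_ecid  # True only for the target phone

# The ordered software, personalized and signed for one phone only:
update = sign_personalized_image(b"ordered software build", ecid="TARGET-ECID")

device_will_run(update, "TARGET-ECID")  # runs: signed, ECID matches
device_will_run(update, "OTHER-ECID")   # refused: wrong phone

# Re-targeting the image to a different ECID breaks the signature,
# which is the brief's second security layer:
retargeted = dict(update, ecid="OTHER-ECID")
device_will_run(retargeted, "OTHER-ECID")  # refused: signature invalid
```

Apple’s response, quoted below, disputes exactly this picture: because personalization happens after the operating system is created, re-signing the same image for another device is trivial for anyone who controls (or steals access to) the signing step.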

Not surprisingly, Apple disagrees on both counts. As to the risk of loss of control:

The government’s assertion that “there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession” (Opp. 24), simply shows the government misunderstands the technology and the nature of the cyber-threat landscape. As Apple engineer Erik Neuenschwander states:

“I believe that Apple’s iOS platform is the most-attacked software platform in existence. Each time Apple closes one vulnerability, attackers work to find another. This is a constant and never-ending battle. Mr. Perino’s description of third-party efforts to circumvent Apple’s security demonstrates this point. And the protections that the government now asks Apple to compromise are the most security-critical software component of the iPhone—any vulnerability or back door, whether introduced intentionally or unintentionally, can represent a risk to all users of Apple devices simultaneously.” [Neuenschwander Supp. Decl. ¶ 28.]

And, as to the harm that might occur if the software does leak out, Apple writes:

The government is also mistaken in claiming that the crippled iOS it wants Apple to build can only be used on one iPhone:

“Mr. Perino’s characterization of Apple’s process . . . is inaccurate. Apple does not create hundreds of millions of operating systems each tailored to an individual device. Each time Apple releases a new operating system, that operating system is the same for every device of a given model. The operating system then gets a personalized signature specific to each device. This personalization occurs as part of the installation process after the iOS is created.

Once GovtOS is created, personalizing it to a new device becomes a simple process. If Apple were forced to create GovtOS for installation on the device at issue in this case, it would likely take only minutes for Apple, or a malicious actor with sufficient access, to perform the necessary engineering work to install it on another device of the same model.

. . . [T]he initial creation of GovtOS itself creates serious ongoing burdens and risks. This includes the risk that if the ability to install GovtOS got into the wrong hands, it would open a significant new avenue of attack, undermining the security protections that Apple has spent years developing to protect its customers.” [[Neuenschwander Supp. Decl.] ¶¶ 17–19.]

Cybersecurity experts agree. E.g., . . . Ex. OO (quoting former NSA expert Will Ackerly: “[u]sing the software even once could give authorities or outsiders new clues to how Apple’s security features work, potentially exposing vulnerabilities that could be exploited in the future”); Ex. EE at 5 [Rep. Conyers, Encryption Hr’g] (“The technical experts have warned us that it is impossible to intentionally introduce flaws into secure products—often called backdoors—that only law enforcement can exploit to the exclusion of terrorists and cyber criminals.”); Dkt. 82 [Experts’ amicus brief] at 10 (the government’s proposed safeguards “are not meaningful barriers to misuse and abuse of the forensic capabilities this Court is ordering Apple to create”); id. at 18 (“A signed firmware update that is not truly limited to a single device, even one created for legitimate forensic purposes, becomes like a ‘skeleton key’ for the entire class of devices.”).

I should emphasize that I have no idea which side is correct on either of these empirical questions. This technical stuff is, to say the least, far outside my wheelhouse. Yet I’m fairly confident that it is precisely these questions that will, or should, determine the nature of the burden on Apple, in this and other cases—and that ought to determine whether the writs should issue under the AWA.

Unfortunately, this also means that there isn’t any nice, easy-to-administer bright-line rule, as between compelling production of “new” and “existing” things, that can point the way to the proper answer in each case. As Bobby and Steve themselves acknowledge, “courts will have to assess both the quantifiable and unquantifiable impact of the vulnerability that the order compels the recipient to develop, and the extent to which such a capability departs in both degree and kind from existing vulnerabilities in the same product.” Those impacts might vary considerably from case to case, whether the software or other things in question are new or (as in the case of the “holy grail” Apple signature) already exist.

On this much, we surely agree: Assessing the burden on Apple—which is largely a function of what it would take to reduce any serious risk of vulnerability of other phones, those not subject to lawful court orders—will indeed require, in each case, “the creation of a detailed record that would allow a magistrate judge to have as much information as possible.” I hope that is the sort of record that the parties begin to create for the benefit of Magistrate Judge Pym at tomorrow’s hearing.


About the Author

is a professor at the Georgetown University Law Center.