I wrote about the FBI’s attempt to force Apple to write an iPhone hacking tool for the bureau over at Time last week (go read that if you’re getting caught up on the case), but we’ve had some added developments over the weekend worth noting. Apple has explained its position in a bit more detail, the Justice Department has filed a motion to compel Apple’s compliance, and FBI Director James Comey has penned a brief blog post at Lawfare arguing that the Bureau isn’t looking to “set a precedent” or “send a message” or “break anyone’s encryption or set a master key loose on the land,” only to provide justice to the victims of a horrific shooting. That’s a message the government’s lawyers hammer home at some length in the motion to compel: They don’t want some master key that could be used to unlock any phone. They just want a little bit of code, uniquely tethered to this one device and useless on any other, which Apple is free to keep in its own headquarters and erase after use. So is Tim Cook just fearmongering when he claims this would require the company to create a more generally exploitable tool?

Not if you understand the realistic consequences of the government’s request. First, as iOS security expert Jonathan Zdziarski observes, even if we’re thinking exclusively about this case, standard forensic practice would require the code of any forensic tool Apple produces to be preserved at least as long as it might be needed as evidence in court. Maybe that on its own isn’t such a big deal: source code for brute-force attacks on iOS passcodes is already out there in frameworks like MobileKeyBag, but it isn’t much use unless you can get the iPhone’s processor to actually run it, which requires either exploiting a flaw in the secure boot chain (which is how “jailbreaking” works) or having the code signed with Apple’s private key. If you can do the former without wiping the key material you need on the device, all of this is largely moot; the additional risk here comes from the existence of that signed code and, in the longer term, of a process for routinely signing such code.
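
To see why the signature is the crown jewel here, consider a deliberately toy model of a signature-gated boot stage. Everything below is illustrative: real iOS boot stages use Apple’s own image formats, hardware-fused root certificates, and real public-key signatures rather than the HMAC stand-in used here.

```python
import hashlib
import hmac

# Stand-in secret. On real devices the signing key never leaves Apple;
# the device only carries what it needs to verify signatures.
APPLE_SIGNING_KEY = b"apple-private-signing-key"

def sign(image: bytes, key: bytes) -> bytes:
    """Toy 'signature': an HMAC standing in for Apple's real code signature."""
    return hmac.new(key, image, hashlib.sha256).digest()

def boot_next_stage(image: bytes, signature: bytes) -> None:
    """Each boot stage refuses to hand control to unsigned code."""
    if not hmac.compare_digest(sign(image, APPLE_SIGNING_KEY), signature):
        raise RuntimeError("refusing to boot: image not signed by Apple")
    print("signature OK, executing next stage")

firmware = b"passcode brute-force tool (MobileKeyBag-style code)"

# Unsigned code is rejected outright...
try:
    boot_next_stage(firmware, b"\x00" * 32)
except RuntimeError as err:
    print(err)

# ...while the very same code, once signed, runs without complaint.
boot_next_stage(firmware, sign(firmware, APPLE_SIGNING_KEY))
```

The code floating around in public is the easy half; the signature, and the key behind it, is what the court order actually conscripts.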

DOJ wants to downplay that risk: Apple, they say, can ensure the signed custom firmware they want Apple to load is designed to work only on this one specific device. Understand, however, that the way you’d do this isn’t really by building a “key” that only works in one lock. You have to design a skeleton key, then effectively cripple it so it won’t work in any other locks. And that creates a new attack surface for an adversary who manages to obtain one of these device-specific pieces of software. Previously, you had to attack the authentication process on the phone to get your own code, unsigned by Apple, to load. With this signed code in hand, you’ve got the potentially much easier task of circumventing the part of it that prevents it from running on other devices. The simplest ways of implementing that tethering would be relatively easy to defeat. You could, for instance, write the tool to check a specific hardware ID number and refuse to load unless it matches the one you’ve coded in. But someone with physical access to a device could feed it false information, “spoofing” the ID of the device the software was built to run on. Writing the code is the easy part; Apple can probably just tweak and sign tools already in the wild. Guaranteeing that a crippled one-device key can’t be un-crippled and turned back into a skeleton key is the harder part. There are more complicated and sophisticated methods Apple might be able to use to tether their hacking tool to a specific device, which would be more difficult to circumvent (I’m admittedly bumping into the limits of my technical understanding of Apple’s security architecture here), but then we run into the problem of whether any of this scales securely.
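
Here’s a simplified sketch of the naive version of that tethering and why it buys so little. The identifiers and functions are made up for illustration; the real request would involve Apple’s actual device identifiers (such as the ECID) and whatever checks Apple’s engineers build around them.

```python
# Illustrative only: the weakest form of "this tool only runs on one phone."
TARGET_DEVICE_ID = "ECID:0000DEADBEEF"   # hypothetical ID baked into the signed tool

def read_hardware_id(device: dict) -> str:
    # The signed tool asks the device (or the forensic rig it's attached to)
    # for its identifier. Whoever controls that channel controls the answer.
    return device["reported_id"]

def run_unlock_tool(device: dict) -> None:
    if read_hardware_id(device) != TARGET_DEVICE_ID:
        raise RuntimeError("this signed image only runs on the target device")
    print("ID matches, brute-forcing passcode on", device["name"])

target = {"name": "the seized iPhone", "reported_id": "ECID:0000DEADBEEF"}
other  = {"name": "someone else's iPhone", "reported_id": "ECID:0000CAFEF00D"}

run_unlock_tool(target)                  # works as intended on the target phone

# An adversary holding the signed image doesn't need to forge Apple's
# signature; lying about the identifier is enough.
other["reported_id"] = TARGET_DEVICE_ID  # "spoofing" the target's ID
run_unlock_tool(other)                   # the crippled key is a skeleton key again
```

A more serious design would check an identifier the phone’s own silicon attests to during boot, which is harder to fake, but the burden is then on Apple to get that binding right for every such tool it signs.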

However loudly the Justice Department protests that this dispute is simply about one particular phone, that’s fairly clearly not the case. Set aside the other, even more dangerous ways Apple could be compelled to use their private key and stay focused on breaking iPhones for the moment. The Manhattan DA’s office alone has at least 175 iPhones they’d like Apple to help them break into, and DOJ itself has 12 other pending cases in which it is seeking access to locked iPhones. Realistically, if Apple loses here (and especially if they lose at the appellate level, which is where this is likely headed given Apple’s decision to hire superstar lawyer Ted Olson for the case), they’re going to be fielding thousands of similar demands every year. As a practical matter, they’re going to need a team dedicated to developing, debugging, testing, customizing, and deploying the code used to brute-force passcodes.

Now, the Holy Grail of Apple’s security infrastructure, the signing key itself, is almost certainly stored in secure vaults, on a hardware security module that makes it difficult or impossible to copy the key off that dedicated hardware, and it is likely protected by elaborate procedures that must be followed to authenticate major new software releases. If your adversaries realistically include, say, the Chinese and Russian intelligence services (and for Apple, you’d better believe they do), guarding against exfiltration or misuse of that Holy Grail key is already a serious security problem. Doing the same for a continuously updated and deployed hacking tool is likely to be hugely more difficult, a contrast sketched below. As the company explains:

Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.
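
To give a sense of that contrast, here’s a rough, purely hypothetical sketch; it is not any real HSM’s API, nor Apple’s actual release process. The point is the shape of the interface: a hardware security module will sign things under controlled procedures, but it offers no operation that hands over the key. A signed hacking tool, by contrast, is just a file, and it has to leave the vault to be useful.

```python
import hashlib
import hmac

class ToyHSM:
    """Hypothetical stand-in for a hardware security module holding a signing key."""

    def __init__(self, key: bytes):
        self.__key = key  # held internally; note there is no export method

    def sign(self, image: bytes, approvals: int) -> bytes:
        # Imagine multi-person authorization gating every signing operation.
        if approvals < 2:
            raise PermissionError("signing requires multiple sign-offs")
        # HMAC stands in for a real code signature here.
        return hmac.new(self.__key, image, hashlib.sha256).digest()

hsm = ToyHSM(b"stand-in-signing-key")

# Stealing the key means defeating the vault, the hardware, and the procedures.
# A signed unlock tool is different: once produced, it exists as ordinary bytes
# on build servers, test rigs, and engineers' machines, and a fresh one has to
# be produced, handled, and tracked for every new request.
signed_tool = hsm.sign(b"passcode brute-force image for device X", approvals=2)
```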

The Justice Department might not intend to “set a master key loose on the land” — but the predictable consequence of mandating compliance with requests of this type will be to significantly increase the chance of exactly that occurring. And that’s an increased risk that every individual or enterprise customer relying on iOS devices to secure critical data will need to take into account.

Finally, it’s worth stressing the awkward position this puts Apple engineers in, and the contradictory incentives it generates. A loss for Apple here very quickly results in their being required to keep a team of engineers in-house dedicated to complying with requests to either hack phones or build and disseminate tools for government agencies to hack phones. Those may or may not be the same engineers responsible for designing and building security features for iOS devices in the first instance. As Comey notes, in support of his “just this once” argument, the hacking tool the FBI wants Apple to build here is “limited and its value increasingly obsolete because the technology continues to evolve.” Now, maybe that’s right: probably this exact attack won’t work, or at least not in the same way, on the next model of iPhone. But that sounds more like a bug than a feature.

Consider: Possibly the next iPhone simply eliminates Apple’s ability to assist in any way. But it’s hard to imagine a scenario where the designer and key-holder for a device meant to be used by normal humans can do literally nothing, at the margin, to assist an attacker. That means every improvement in device security involves a gamble: Maybe the cost of developing new ways to attack the newly hardened device becomes so high that the courts recognize it as an “undue burden” and start quashing (or declining to issue) All Writs Act orders to compel hacking assistance. Maybe. But Apple is a very large, very rich company, and much of the practical “burden” comes from the demands of complying securely and at scale. The government will surely continue arguing in future cases that the burden of complying just this one time is not so great for a huge tech company like Apple. (And, to quote The Smiths, they’ll never, never do it again; of course they won’t, not until the next time.)

Under that scenario, engineers face an additional difficult tradeoff: Every design choice they make to improve device security entails not only the foreseeable front-end costs of implementing it, but also the unpredictable back-end costs of degrading that improved security, provided someone can think of a way Apple is uniquely situated to do so vis-à-vis any particular security measure. That makes it, in short, risky and potentially costly for the company to improve its own security. In an extreme scenario (think of the Lavabit case), the government may be pushed to demand more radical forms of assistance as Apple’s ability to comply is diminished. Because Apple has rendered itself incapable of simply extracting data from the iPhone, the government has already ratcheted up its demands, asking the company to build a tool to enable a brute-force attack on the passcode. Late last year, for instance, Apple increased the length of the default numeric iPhone PIN from four to six digits, which radically increases the time required to mount a brute-force attack of this kind against a numeric passcode (the arithmetic below gives a sense of the scale), and so increases the burden on Apple, at least if they want to run the attack in-house rather than hand exploit code to an outside party. Instead of simply asking whether new security measures are cost-effective to implement from a user’s perspective, they’ll need to evaluate whether those measures justify the additional cost of being required to attack them.
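
For a ballpark sense of what that PIN change means for an on-device attack: Apple has said the passcode key derivation is deliberately slow, on the order of 80 milliseconds per attempt on the device, so the number of guesses is the whole game. The figures below are back-of-the-envelope only and ignore escalating retry delays and any overhead in the tool itself.

```python
# Back-of-the-envelope arithmetic. The ~80 ms per attempt figure comes from
# Apple's published description of on-device passcode key derivation; real
# timings also depend on retry delays and the unlock tool's own overhead.
SECONDS_PER_GUESS = 0.080

for digits in (4, 6):
    guesses    = 10 ** digits                  # every possible numeric PIN
    worst_case = guesses * SECONDS_PER_GUESS   # exhaust the whole space
    average    = worst_case / 2                # expected time for a random PIN
    print(f"{digits}-digit PIN: {guesses:>9,} guesses, "
          f"worst case {worst_case / 3600:5.1f} h, average {average / 3600:5.1f} h")

# Roughly 13 minutes at worst for a 4-digit PIN versus about 22 hours for a
# 6-digit one: each added digit multiplies the attacker's work by ten.
```

Every one of those hours is compliance burden if Apple is the party required to run the attack.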

Little wonder, then, that Comey and the FBI keep stressing that they’re seeking very narrow and limited relief, in just this one case. If that were true, then, however unlikely it is that any useful data will be recovered from this phone, it would seem awfully unreasonable for Apple not to offer its voluntary assistance this one time. Once you realize that it’s very obviously not true, and consider even just the most immediate and unambiguous near-term consequences (leaving aside the prospect of tech companies more broadly being forced to sign other sorts of exploit code), it starts to look much more like the Justice Department is the one making unreasonable demands.