Which would you prefer: keeping your valuables in a locked safe, or keeping them in a shoebox and trusting that everyone will obey the laws against theft and fear the penalties those laws carry? Most, if not all, of us would choose the former. That’s so even if we realize that safe-crackers may someday find a way to bust open even the most top-of-the-line safe currently on offer.

Yet in its new policy report on encryption, the East-West Institute has announced a policy preference for a regime that requires us all to use digital shoeboxes instead of safes, and to simply trust that the police — and all the burglars out there — will behave themselves, because there will be rules about whether, when, how, and by whom it’s OK to raid the shoebox.

The report’s over-reliance on trust in the government is just one of the problems identified in this great review of EWI’s report on Lawfare by my Stanford colleague Herb Lin. As the report’s acknowledgments note, Herb and I both provided comments and feedback to EWI during the report’s development. But (as Herb predicted) I disagree strongly with its conclusions as well as its underlying assumptions. I don’t think the report focuses on the right question, and although its aim is to help move the discussion around encryption forward, I believe it keeps that discussion stuck in the past.

As Herb points out, EWI’s report “assumes away some of the most important concerns of those in privacy and technology communities” when it comes to proposals to regulate strong encryption. (I’m using the term “strong encryption” in the sense the privacy community uses it, as elucidated by Herb: “encryption with zero likelihood that the ciphertext it produces can be decrypted by anyone other than the intended recipient.”) He notes that the report largely holds back from recommending the imposition on law enforcement of “hard measures”: “actions that impose legally enforceable requirements for certain visible and operational behaviors.”
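To make Herb’s definition concrete, here is a minimal Python sketch (my own illustration, not anything from the report) using the widely available cryptography package. The point is that the guarantee is mathematical, not legal: without the key, decryption simply fails, for burglars and police alike.

```python
# A minimal sketch (mine, not EWI's) of the "strong encryption" property,
# using the third-party Python "cryptography" package: only the holder
# of the key can recover the plaintext.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()                       # held only by the intended recipient
ciphertext = Fernet(key).encrypt(b"meet at noon")

# The intended recipient decrypts successfully...
assert Fernet(key).decrypt(ciphertext) == b"meet at noon"

# ...while anyone without the key -- burglar or police, warrant or no --
# gets nothing but an error.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: decryption fails")
```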

In a way, strong encryption is a hard measure. That’s one of the reasons why law enforcement dislikes it and why the privacy and civil liberties community thinks it’s so important, above and beyond its benefits for data security. Strong encryption imposes an enforceable limitation on the actions of governments, and therefore on their bad acts as well as their good-faith acts — not through the law, but through technology. It means we need not trust in the efficacy of the legal hard measures that would have to constrain law enforcement in a world where exceptional access to plaintext is mandated. On yet another point Herb and I agree: it is facile of EWI to assume unflagging good faith and competence on the part of governments, even democratic ones, and to assume away the (historically well-founded) concern about potential abuse.

Its failure to give what I consider adequate weight to civil liberties and human rights concerns is not the only bone I have to pick with the report. I also disagree with its overall framing. The document sets forth nine “normative recommendations on encryption policy” meant to help formulate “balanced” policy regimes, and proposes a supposedly comprehensive and value-neutral “analytical framework” for evaluating the pros and cons of any given policy proposal. It then suggests three different ideas combined into two proposed regimes: “design mandates”* requiring products to be built with “the capability to accommodate future lawful access requests,” coupled with compelling encryption vendors to assist law enforcement; and government hacking (again coupled with compelled assistance). The report then scrutinizes each of the two regimes, carefully considering the trade-offs and equities involved in both.

We might be led to expect some degree of objectivity from all the report’s talk about “balance,” its “analytical framework,” and its “algorithm” for evaluating policy options. A framework doesn’t prescribe outcomes; it merely instructs us in methodology. It purports to make value judgments explicit but not to adopt them.

Yet the worldview driving the report — that law enforcement needs, deserves, and should have exceptional access to plaintext, and that such access is an indispensable prerequisite for public safety — is not neutral. EWI’s framework embeds certain values (here, ones that privilege law enforcement interests) under a veneer of objectivity, just like any other algorithm.

After setting forth its recommendations, presenting its framework, and applying it to the two proposed regimes, the report ends by advocating for design mandates over government hacking. It makes this choice in a single paragraph buried in the conclusion. By announcing a preferred policy outcome, that one paragraph effectively vitiates the 40-plus pages of supposedly neutral discussion that precede it. That is, the report purports to build a scale on which to weigh different proposals in search of “balance”; determining the “right” policy is supposedly left as a “your mileage may vary” exercise for policymakers. The report could, and should, have stopped there, rather than take a side.

What’s more, the scale the report builds still weighs only two options against each other. Design mandates and government hacking are presented as “sample regimes” EWI considers consistent with its nine recommendations. Yet the report fails to explore any additional regimes, or to explain why none merited inclusion, thereby framing EWI’s preference as one of only two real options. The focus on those two regimes (and the assumed necessity of compelled assistance as a baseline requirement of any regime) obscures the fact that the universe of choices is much broader, and includes “take no action,” as the report mentions once or twice in passing.

It’s my position, and that of many technologists and privacy and civil liberties advocates, that there is no need to regulate encryption in order to deal with its alleged impact on law enforcement. That position is not taken seriously in the report. Instead, EWI buys into the politician’s syllogism: “we must do something; this is something; therefore, we must do this,” no matter how unappetizing that something may be.

Indeed, EWI itself seems to view the two regimes as an ugly choice between two evils, forgetting that it artificially constrained itself to that choice. The conclusion mentions the significant, intractable downsides of both selected regimes. I agree with EWI that government hacking, which exploits the unintentional vulnerabilities that unavoidably occur in even the most carefully designed software, is a questionable practice. But the dangers of intentional design mandates have been well understood for decades. I and others believe their downsides far outweigh their upsides. EWI, too, acknowledges how risky and “unattractive” design mandates are. It rationalizes its preference by pointing to the skullduggery inherent in the other choice, government hacking. That regime, EWI says, “risks creating unaccountable power that, as human history continues to show, is fraught with danger to the citizenry,” “no matter how much procedure, transparency and oversight is layered on.”

Incredibly, EWI’s report fails to recognize that those concerns are equally applicable to “design mandates.” Design mandates require intentionally weakening everyone’s encryption for law enforcement. They’re like requiring shoebox-grade security for everybody, while government hacking is like letting safe-makers build the best safes they can, but allowing law enforcement to try to crack them. Yes, legalized safe-cracking by the police “is fraught with danger.” But mandating shoebox-grade security by design makes those risks universal, baking the potential for abuse into every device and communications service subject to the mandates (which, moreover, are still largely theoretical, whereas government hacking already happens in practice). It is difficult to understand why EWI would acknowledge that its reliance “on transparency and the rule of law” could fall short with one regime, yet treat that reliance as infallible under the other.

Even if it had not expressed a policy preference, this report would still help shift the Overton window for the encryption discussion. The report’s ostensibly-but-not-really-neutral “framework” structure, perhaps even more than its conclusion, will give more ammunition to law enforcement officials when they repeat discredited calls to regulate encryption. That does the national (and international) conversation about encryption a disservice. We don’t need more voices giving cover to the misguided notion that legislating exceptional access is necessary and wise. We need more voices guiding the conversation to focus on a question that looks forward instead of backward: How shall law enforcement continue to do its job in the age of ubiquitous encryption?

Look. Strong encryption is here to stay. The genie is out of the bottle; the cat is out of the bag. Rather than wasting time, effort, and taxpayer money perennially re-litigating a moot question, law enforcement should focus on the future. As Thomas Pynchon wrote, “If they can get you asking the wrong questions, they don’t have to worry about answers.” EWI’s report asks, and answers, the wrong question. As long as we keep doing that, we can’t move forward as we must. The future is more interesting than the past. In the words of Alfred North Whitehead, “It is the business of the future to be dangerous; and it is among the merits of science that it equips the future for its duties.” In a world of near-daily headlines about massive data breaches, of hacking tools stolen from the spooks and released to the public, of talk about nation-states launching kinetic responses to cyber attacks, the future is still dangerous. But it is not dangerous because of encryption. Encryption is one of the best means we have of equipping ourselves for this brave new world. Stop asking us to trade that for a shoebox.


* What I’d call exceptional-access mandates, the report calls “design mandates.” When built in to enable government access, such measures are commonly thought of as “backdoors,” a loaded term EWI was trying to avoid.