
Security “Front Doors” vs. “Back Doors”: A Distinction Without a Difference

On Thursday, FBI Director James Comey delivered a talk at the Brookings Institution titled “Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?” His thesis did not stray far from his (and others’) recent calls for limitations on software from companies like Google and Apple that employs cryptography so strong that even the companies themselves cannot break it, even when law enforcement agencies produce a warrant for the encrypted data. These are calls for companies to provide “back doors” to encryption and other security systems: mechanisms through which companies could “unlock” the data using, as one editorial board unfortunately put it, a “secure golden key they would retain and use only when a court has approved a search warrant.”

The problem with the “golden key” approach is that it just doesn’t work.  While a golden key that unlocks data only for legally authorized surveillance might sound like an ideal solution (assuming you trust the government not to abuse it), we don’t actually know how to provide this functionality in practice.  Security engineers, cryptographers, and computer scientists are in almost universal agreement that any technology that provides a government back door also carries a significant risk of weakening security in unexpected ways. In other words, a back door for the government can easily – and quietly – become a back door for criminals and foreign intelligence services.
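To make the objection concrete, here is a deliberately minimal sketch of what a “golden key” scheme looks like structurally. Everything in it is hypothetical and the cipher is a toy (SHA-256 used as a keystream, XOR for encryption), not real cryptography; the point is only the shape of the design: every message carries a second way in, usable by whoever holds the escrow key, lawfully or otherwise.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. Illustrative only; NOT real crypto."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical "golden key" held by the vendor or the government.
ESCROW_KEY = secrets.token_bytes(32)

def encrypt_with_escrow(user_key: bytes, plaintext: bytes):
    # Normal path: ciphertext under the user's own key.
    ciphertext = xor_bytes(plaintext, keystream(user_key, len(plaintext)))
    # Escrow path: the same user key, wrapped under the golden key and
    # attached to the message -- the built-in second way to decrypt.
    wrapped_key = xor_bytes(user_key, keystream(ESCROW_KEY, len(user_key)))
    return ciphertext, wrapped_key

def golden_key_decrypt(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    # Whoever holds ESCROW_KEY -- a court-supervised agent, a rogue insider,
    # or a foreign service that stole it -- recovers any user's traffic.
    user_key = xor_bytes(wrapped_key, keystream(ESCROW_KEY, len(wrapped_key)))
    return xor_bytes(ciphertext, keystream(user_key, len(ciphertext)))

message = b"meet at the usual place"
user_key = secrets.token_bytes(32)
ciphertext, wrapped_key = encrypt_with_escrow(user_key, message)
assert golden_key_decrypt(ciphertext, wrapped_key) == message
```

Note what the sketch cannot express: any policy about warrants. The escrow key decrypts everything for anyone who obtains it; the “only when a court approves” part exists in law, not in the mathematics.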

The problem is chiefly one of engineering and complexity.  While a government backdoor might sound like a conceptually simple feature, security systems (especially those involving cryptography) are actually incredibly complex.  Even relatively simple systems that have been deemed “obviously secure” frequently turn out to have subtle flaws that can be exploited in surprising – and often catastrophic – ways. And as complexity increases (and a backdoor adds plenty of it), this problem becomes exponentially worse.  The cardinal rule of cryptography and security is widely known: keep the system as simple and as well understood – by engineers and average users alike – as humanly possible. Even this rule is not a perfect safeguard: very simple security systems are hard enough to design and build reliably without adding the (significant) complexity of a “back door” for law enforcement. The history of cryptography and security is littered with examples of systems, some fielded for years, that eventually fell prey to subtle design and implementation flaws their designers never detected.

Worse, a “back door” adds exactly the kind of complexity that is likely to introduce new flaws.  It increases the “attack surface” of the system, providing new points of leverage that a nefarious attacker can exploit. It amounts to shipping a system with a built-in flaw. As Apple, Google, and other similarly situated companies point out, why would customers pay for and use such a system? Companies are now awakening to the fact that, in a post-Snowden world, customers are becoming more savvy about security issues and will choose between products on that basis. If companies like Apple, Google, Microsoft, and Cisco (to name just a few) are somehow forced to include government-mandated flaws in their products, those flawed systems become part of our national critical infrastructure, and the stakes become much higher than hacked cell phone photos or address books.

In his talk, Director Comey stated that he is no longer seeking a “back door” to crypto systems, but rather a “front door” to those systems. When asked what he meant by that, Comey said he has been told “by people smarter than [him]” that “any time there’s a door, there’s a risk that someone’s going to try to pick the lock…but if the door is built transparently” as a “front door” lawful intercept capability, “the chances of a vulnerability being unseen are much lower” than with a “back door” to that system. What we think Director Comey was trying to describe was the concept of “security through obscurity,” a widely disavowed practice in which the security of a system relies on the secrecy of its design. Once that secret is discovered – and it is always discovered, sooner or later – the system’s security is lost forever. One need look no further than the many failed attempts at secure digital rights management (DRM) systems to see this in action.
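The DRM failure mode can be sketched in a few lines. This toy is hypothetical in every detail (the secret, the scheme, the content) and uses a deliberately weak construction; it mirrors the structure of many real DRM systems, in which a single global secret ships inside every player and the design’s secrecy is the only protection.

```python
import hashlib

# Toy "security through obscurity": every player ships with the same
# hidden global secret, and the scheme's safety rests entirely on that
# secret staying hidden. (All names and values here are hypothetical;
# the XOR "cipher" is a toy, not real cryptography.)
HIDDEN_GLOBAL_SECRET = b"baked-into-every-player"

def scramble(content: bytes) -> bytes:
    pad = hashlib.sha256(HIDDEN_GLOBAL_SECRET).digest()
    # Repeat the pad to cover the content; XOR is its own inverse.
    pad = (pad * (len(content) // len(pad) + 1))[: len(content)]
    return bytes(c ^ p for c, p in zip(content, pad))

# The studio scrambles a film once, for all players everywhere.
protected = scramble(b"the film")

# An attacker extracts the secret from any one device (memory dump,
# firmware reverse engineering, ...). Once it is out, it unscrambles
# everything, for everyone, forever -- there is nothing to "patch".
extracted_secret = HIDDEN_GLOBAL_SECRET
pad = hashlib.sha256(extracted_secret).digest()
pad = (pad * (len(protected) // len(pad) + 1))[: len(protected)]
assert bytes(c ^ p for c, p in zip(protected, pad)) == b"the film"
```

The design has no per-user key to revoke: compromise of the one shared secret is total and permanent, which is why obscurity-based schemes fail the moment their secret leaks.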

But Comey’s “back door” vs. “front door” distinction is a false one, and it only serves to confuse the issue. Arguments against these intentional security flaws were widely aired in the early 1990s and successfully defeated government proposals such as private key escrow (see, e.g., the Clipper Chip). Those of us who made those arguments against what were then called “back doors” to crypto systems never asserted that design transparency was the heart of the problem. As implied above, no cryptographer in her right mind would consider designing a “security through obscurity”-based system these days. Our concern is with the door itself, front or back. If we design systems with lawful intercept mechanisms built in, we have introduced complexity, and we have therefore made the system inherently less secure in the process. This is true even for systems whose designs are open for all to see and inspect. The difference between a “front door” and a “back door” approach to law enforcement interception of encrypted communications is thus purely semantic.

What is especially disturbing about Director Comey’s not-back-doors-but-front-doors language is the false sense of security it will likely instill in those unfamiliar with the technological issues at stake—which is practically everyone. Even Comey himself repeatedly admitted to not fully understanding the technology behind these systems, a statement we find somewhat troubling coming from an FBI Director asking to alter them. Further, Comey agreed that he is not interested in reviving the “key escrow” proposals of the first crypto wars in the 1990s, in which the government would have had access to users’ private crypto keys held in escrow, but is speaking more “thematically” about the problem. If this is the case, however, it is hard to see just what law enforcement has in mind, short of a “golden key” created by the “wizards” working at Apple and Google. While we have no doubt that some of the smartest people in software reside within these companies, we think it is misleading and unproductive for law enforcement agencies to confuse the general public with meaningless distinctions and trust in magic solutions. We agree with Director Comey when he calls for a nationwide conversation about security and privacy, but we must all do our best to ensure that the conversation is well informed.


About the Authors

Jeffrey Vagle is Lecturer in Law and Executive Director of the Center for Technology, Innovation and Competition at the University of Pennsylvania Law School. Follow him on Twitter (@jvagle).

Matt Blaze is an Associate Professor of Computer and Information Science at the School of Engineering and Applied Science at the University of Pennsylvania. Follow him on Twitter (@mattblaze).