The latest Crypto War is being fought on multiple fronts: behind closed doors, in the courts, and now in Congress. On April 13, Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.), leaders of the Senate Intelligence Committee, officially released a discussion draft of the anti-encryption bill they had been promising since last December. Called the Compliance with Court Orders Act of 2016 (CCOA), the result of four-plus months of work is a misguided, dangerous, and technologically tone-deaf piece of legislation that would create far more problems than it could possibly solve. This post will summarize the bill, then discuss some of the copious problems it poses.
What the Bill Says
What does the CCOA require? Upon receipt of a court order or warrant for “information or data” sought by a federal, state, local, or tribal government in specific types of investigations or prosecutions, the CCOA requires covered entities to give the government the information or data in an “intelligible” (i.e., unencrypted) format, or to provide any “necessary” technical assistance to render it intelligible. The CCOA only kicks in if the data is “unintelligible” (i.e., encrypted) due to “a feature, product, or service” that is “owned, controlled, created, or provided” by the entity (or by a third party on its behalf). The bill says that no government officer can dictate or prohibit specific design requirements to comply with the law.
Who is covered? “Covered entities” include device manufacturers, software manufacturers, providers of wire or electronic communications services (ECS) or remote computing services (RCS), and “any person who provides a product or method to facilitate a communication or the processing or storage of data.” If a covered entity licenses its products, services, applications, or software, any ECS or RCS provider that “distributes” the licenses must ensure they can comply with the law’s requirements.
What must be provided? “Information” is not defined, but “data” is defined to include the contents of communications, identifying information about communications and the parties to them (i.e., metadata), information stored remotely or on a device made by a covered entity, and information identifying a specific device. The bill also indicates that covered entities must provide “technical assistance” to “isolat[e]” the information or data, decrypt it (if it was encrypted by the covered entity or a third party acting on its behalf), and deliver the information or data as it’s transmitted or expeditiously (if it’s stored “by a covered entity or on a device”).
Which court orders qualify? The official discussion draft narrowed the very broad definition of “court order” contained in an earlier leaked draft to orders or warrants issued by a “court of competent jurisdiction” in investigations or prosecutions of certain enumerated types of offenses. Those include violent crimes, serious drug crimes, federal crimes against children, espionage, and terrorism, as well as their state-law equivalents.
Where the Bill Goes Wrong
In short, the bill prohibits covered entities from designing encryption and other security features so that encrypted data is accessible only to the user, not to law enforcement or to the entity itself. This is what I would call "effective encryption," but what law enforcement derisively calls "warrant-proof" encryption. If you’ve been following the encryption debate over the past year and a half, you’ll recognize instantly that this bill is not the innocuous public safety measure that its name implies or that its sponsors would have the public think.
They aren’t fooling anyone (or at least anyone who’s been paying attention). The White House has refused to endorse the bill. Other members of Congress have condemned it, including Rep. Darrell Issa (R-Calif.) and Sen. Ron Wyden (D-Ore.), who has promised to filibuster the bill if it reaches the Senate floor. Perhaps sensing an uphill battle ahead, Burr and Feinstein scheduled an April 13 staff briefing (not an actual hearing) about the “going dark” issue with a lineup composed entirely of police and prosecutors: not a single cryptographer or security expert, no one from civil society, and no representatives of the industry entities that would be subject to the bill.
Had those stakeholders been invited, here are five of the CCOA’s many flaws that they might have brought up:
1. It will make us less secure. The CCOA would prohibit covered entities in the US from implementing state-of-the-art data security in their products and services. It would, intentionally or unintentionally, effectively outlaw such cornerstone security concepts as end-to-end encryption, forward secrecy, and HTTPS, which encrypts web traffic against hackers, state-sponsored attackers, and other snoops (but which the Senate’s own website can’t handle). It makes covered “license distributors” responsible for the compliance of the software being distributed, meaning Apple’s and Google’s app stores would be on the hook for ensuring every app on offer has weak enough security to meet government standards. It would chill innovation by rendering it largely pointless to work on making software and hardware more secure, because only CCOA-compliant security architectures would be legal. (That said, magical golden key R&D would surely thrive.) The bill doesn’t even require that a covered entity take any measures to protect the vulnerabilities it must build in for law enforcement access, so as to at least make it harder for anyone but the “good guys” to use them. Burr and Feinstein mention the word “security” four times in their bill’s one-page preamble. I do not think it means what they think it means. Their bill serves only to harm security, not promote it.
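To see why an end-to-end design cannot satisfy an "intelligible data" mandate, here is a minimal sketch (a toy construction for illustration only, not real cryptography; the names and the keystream scheme are my own, not drawn from the bill or any actual product). The key is generated on the user's device and never shared, so the provider relays and stores only ciphertext:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream derived from the key (illustration only, NOT a real cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream; decryption is the same operation."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# The key lives only on the user's device; the provider never sees it.
user_key = secrets.token_bytes(32)
ciphertext = encrypt(user_key, b"meet at noon")

# The provider stores or relays only this ciphertext. Served with a court
# order, it holds no key and therefore has nothing "intelligible" to
# produce; only the user can recover the plaintext.
assert decrypt(user_key, ciphertext) == b"meet at noon"
```

The point is structural, not cryptographic: once the only decryption key sits on the user's device, the covered entity cannot render the data intelligible without redesigning the system to retain a copy of the key, and that retained key is precisely the vulnerability security experts object to.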
2. It can’t stop terrorists and criminals from hiding their activities. The joke in the infosec community used to be that “when crypto is outlawed, only outlaws will use crypto.” The joke’s on Burr and Feinstein: No one — not even FBI Director James Comey — can seriously deny that sophisticated terrorists and criminals will still have access to “warrant-proof” encryption no matter what law the US passes. Not only are effective encryption offerings readily available from entities based outside the US, but millions upon millions of devices, apps, and software programs already in use employ the very encryption the bill would ban going forward. The crypto cat is out of the bag, as New America’s Open Technology Institute put it, and law enforcement’s alarmist and unsupported “going dark” rhetoric can’t hide that fact.
True, many people might cling to the current versions of iOS or WhatsApp for as long as possible rather than update to a CCOA-compliant version. But as explained in an amicus brief I co-authored in the San Bernardino Apple case, declining to install updates is a double-edged sword. Even if it keeps out a known vulnerability (the one the CCOA would mandate), it also forgoes the patches for other security flaws that automatic updates are supposed to fix, leaving the user’s devices and data vulnerable to malicious actors. The CCOA’s perverse outcome would be that the law-abiding general populace gets a lower level of data security than do the sophisticated bad guys seeking to prey upon them.
3. There is no “middle ground” on encryption. This one-sided bill tries to hold itself out as the “middle ground” on encryption that politician after politician after politician has called for. It opines that American companies can and should “implement … appropriate data security and still respect the rule of law and comply with all legal requirements and court orders.” But as cryptography experts have repeatedly explained over the last two decades, there is no middle ground on this issue. Mandating a means of access for law enforcement simply isn’t “appropriate” data security. It is a vulnerability, and its use can’t be limited to “good guys” bearing a court order. This was true 20 years ago and it’s still true today. That doesn’t make it a “tired argument,” as Burr has said. Galileo’s defense of heliocentrism wasn’t “tired” just because Copernicus had written the same thing 70 years earlier. The Church didn’t succeed in banning reality 400 years ago, and neither can Burr today.
The scary implication of this “middle ground” lip service is that if “appropriate” data security means mandatory vulnerabilities, Burr and Feinstein must believe it’s inappropriate for companies like Apple to provide the best security they possibly can. That’s not the direction our national policy should be going. We’ve seen what passes for “appropriate” data security in Washington, what with the breaches and other “security incidents” at the OPM, IRS, FBI, and Healthcare.gov. Congress has no business trying to bring everyone’s data security down to the level of “good enough for government work.”
4. The bill is simultaneously extraneous to and in conflict with existing law. In an unusual gesture of respect for the judicial branch Congress typically snubs, the CCOA’s eponymous purpose is to require compliance with court orders. Burr says it’s just a “follow the rule of law” bill. But offering effective encryption is not illegal. And resisting legal process, including by moving to quash a subpoena or challenging the propriety of an All Writs Act order, isn’t holding yourself “above the law,” as the bill declaims in an unsubtle dig at Apple. Exercising due process rights is part of the rule of law. And once the legal options for pushback are exhausted, it’s already impermissible to flout a lawful court order. Civil and criminal contempt sanctions already exist to address noncompliance. Plus, the courts have long exercised broad inherent powers to enforce their orders. Quite apart from the other serious problems it would create, the bill is extraneous and unnecessary for the law-and-order purpose it supposedly serves. Congress shouldn’t waste its time on a bill whose best feature is its redundancy.
Yet the bill simultaneously manages to conflict with existing law as well. How is it supposed to interact with the hard-won limitations codified in CALEA? The CCOA is the latest in a line of periodically floated bills that have been called “CALEA II,” and its “design limitations” proviso apes CALEA’s design limitations language. So is the bill intended to amend CALEA? Specifically, is it supposed to repeal the express carve-outs for “information services” (i.e., the Internet) and encryption? As my colleague Al Gidari has pointed out, “CALEA did not prohibit a carrier from deploying an encryption service for which it did not retain the ability to decrypt communications for law enforcement access, period.” The CCOA would mandate exactly the opposite, and I read its broad definition of “covered entities” to encompass both telecommunications carriers subject to CALEA and the information services CALEA explicitly exempted. But the CCOA says it subjects covered entities to its requirements “[n]otwithstanding any other provision of law.” These contradictory signals are confusing. Which is it? CALEA or CCOA? Feinstein was in Congress when CALEA was enacted. She and Burr shouldn’t implicitly repeal significant portions of a comprehensive legislative scheme through their shoddy drafting.
5. It’s costly in every way. The CCOA asserts a concern with “economic growth, prosperity, security, stability, and liberty.” But the bill would undermine all of them by kicking data security back to where it was in the ’90s — an era for which we’re still paying the price. Those values require safeguarding a technology that underpins the security of banking and commercial transactions (e-commerce accounted for $89 billion in sales during just the final quarter of 2015), helps defend our national security, shields intellectual property against corporate espionage, and protects privacy and speech rights. What’s more, if America’s hugely valuable tech sector (which Feinstein represents as the senior senator for Silicon Valley) isn’t allowed to offer effective encryption, it will lose out in the global marketplace to foreign competitors.
The bill would be expensive for the entities it covers, too. While plainly aimed at big companies like Apple, the CCOA would also impact small companies and dissuade new market entrants that could have been the next WhatsApp. Coming into compliance would require a major outlay of money and engineering resources. Building secure systems is really, really hard to begin with. Once you’ve introduced a vulnerability by enabling law enforcement access, you still have to secure your system as best you can (and you still can’t get it perfect). And that’s expensive. Yet the CCOA does not, as CALEA did, provide for compensation for compliance costs. It only requires compensation for costs “directly incurred in providing” technical assistance. (Even those direct costs could be high: As Orin Kerr pointed out, the bill requires all “necessary” technical assistance, without any “reasonableness” limitation as in the All Writs Act.) If Burr and Feinstein want to force covered companies to weaken their data security and reduce their global competitiveness, the least they can do is make it up to them by covering the costs.
* * *
There’s only one good thing about the Burr-Feinstein bill: It’s been given poor odds of passing. The discussion draft is clearly not finished — it doesn’t even address penalties for noncompliance — but there is no amount of work that could fix it. Rather than prolonging this embarrassment, Burr and Feinstein would do well to finally start listening to what our modern-day Galileos have been saying for two decades: There is no middle ground. There is no magic rainbow unicorn key. There is no going back.