The Need for Both Legal and Technical Privacy Protections

Last week, Apple and Google came under intense criticism from the law enforcement and national security communities for their decisions to encrypt user data when devices are locked. Their new features mean that the companies can no longer technically comply when they receive warrants from the government requesting that data.

Civil liberties advocates praised these announcements. Others chastised the companies for implementing design decisions that largely serve to thwart legal government requests. I will return to these criticisms below, but first want to state up front that I don’t think these changes will have a major impact on user privacy or law enforcement capabilities. Rather, the new features are notable because they illustrate a broader and much more important trend.

United States-based companies are increasingly making technical decisions that thwart authorities’ ability to collect user data from them. These decisions include efforts to: 1) be more judicious about what user data is collected in the first place; 2) encrypt data and store it in locations that may be out of the reach of governments’ legal authorities; and 3) design access regimes such that the companies themselves cannot decrypt data and therefore cannot satisfy government requests. I call these hybrid legal-technical surveillance countermeasures because they are technical mechanisms designed explicitly to address what some see as shortcomings of current law. They complement purely technical measures, such as efforts to eliminate security vulnerabilities, and purely legal measures, such as efforts to more consistently challenge government data requests.
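The third design listed above can be illustrated with a minimal sketch of a passcode-derived encryption key. The function and parameters below are hypothetical, not Apple’s or Google’s actual implementation; the point is simply that when the key is derived on-device from a secret only the user knows, the vendor holds nothing it could hand over in response to a warrant.

```python
import hashlib
import os

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    """Derive an encryption key on-device from the user's passcode.

    The vendor never sees the passcode or the derived key, so it cannot
    decrypt the protected data even under legal compulsion. (Illustrative
    sketch only; real designs also mix in hardware-bound secrets.)
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

# A per-device random salt, generated and stored on the device itself.
salt = os.urandom(16)
key = derive_key("1234", salt)

# Only the correct passcode reproduces the key; a wrong guess does not.
assert derive_key("1234", salt) == key
assert derive_key("0000", salt) != key
```

The design choice at issue is exactly this: by moving key derivation onto the device, the company converts a legal question (must we comply?) into a technical fact (we cannot).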

These decisions are being driven both at an engineering level, in reaction to a perception that the US and its fellow Five Eyes governments have overstepped their surveillance authorities, and at a business level, in response to customer concerns about government access to data. It is too early to tell how significant these developments will be; business incentives are still in favor of greater collection, aggregation, and sharing of data, which will create headwinds for any corporate effort to thwart government data requests. But we can safely assume that this new trend will create additional challenges for the law enforcement and national security communities.

Lest we conclude that this is an entirely negative development, we should recognize that the new trend follows a decade during which many US companies designed their technology in ways that enabled law enforcement and national security authorities. Apple’s decision here is a case in point. Years earlier, the company implemented a security feature that allowed it to access a customer’s iPhone data, likely for convenience or because it simply did not occur to the engineers involved that an alternative approach might be better. That decision, along with the rest of Apple’s design choices, fueled the market-saturating popularity of a device that stores enormous amounts of personal information, greatly increasing government access to data about prospective law enforcement and national security targets. Thus, Apple’s recent move simply reversed one earlier decision. It represents a return to an older status quo rather than a nefarious plot to unilaterally degrade law enforcement capabilities.

The dynamic of the previous decade, in which technology enabled legal authorities to collect ever more data, served the public interest in cases in which evil-doers recorded data about their evil-doing, because that data became readily available to law enforcement. But it also increased the availability and accessibility of data about everybody else, throwing the balance between privacy and security out of whack. The Supreme Court’s recent decision in Riley v. California essentially recognized this reality and raised the legal hurdle for data access by requiring that police obtain a warrant before searching a cell phone.

Apple and Google have been criticized for making technical changes that have the sole effect of thwarting lawful warrants. Critics assert that these changes will damage the public interest by protecting criminals and those who want to do us harm. More importantly, they suggest that warrant protections are both necessary and sufficient to safeguard the public interest and to adjudicate the circumstances under which the government may access data. By extension, any technical mechanism that goes beyond those warrant protections must detract from the public interest.

These arguments ignore the benefits described above that have accrued to law enforcement over the last decade. Boiled down to their essence, they lead to the conclusion that technology should facilitate greater government access to data but, when it comes to actually restricting government access, only legal processes such as those put in place by the Riley decision are appropriate. Given the remarkable role that technology has played in facilitating legal data collection, this would seem to be an absurd conclusion.

Naturally, law enforcement officials have more faith in legal processes than do engineers, who are far more disillusioned about the current mechanisms available to protect the public. Those engineers point to FISA court opinions and to the more general failure to reform laws like the Electronic Communications Privacy Act as examples of legal processes that have not adapted to technological realities. In this context, the Riley decision offers small solace to those who believe the current legal framework is inadequate to the task at hand. These engineers are now taking matters into their own hands, and they believe their solution set is better able to protect the public. I’d personally still opt for warrant protections and other legal processes, but I don’t see why one should have to do so to the exclusion of technical mechanisms that might provide additional safeguards.

The pendulum may be starting to swing away from technology that enables legal authorities and toward technology that thwarts those authorities. This means that the issues raised by Apple’s recent announcement, minor in themselves, will likely come to the fore in the near term in cases with far more impact on privacy and law enforcement. When that occurs, we should appreciate that there is an appropriate role for both legal and technical hurdles to government data collection.

About the Author(s)

Marshall Erwin

Head of Trust at Mozilla, Non-Residential Fellow at the Center for Internet and Society at Stanford Law School, Former Intelligence Specialist at the Congressional Research Service, Former Counterterrorism Analyst