As technology has improved to provide users more secure, encrypted communication options, law enforcement and intelligence agencies have pursued various ways to gain access to citizens’ communications, particularly under the guise of stopping the spread of child sexual abuse material (CSAM). I have previously discussed the problems with these approaches, including encryption back doors and regulating the content of app stores. A newly proposed technique known as client-side scanning (CSS) presents a possible solution to the challenge of investigating CSAM trafficking, nominally without the need to degrade user security. But the question of whether CSS actually resolves the law enforcement vs. strong encryption debate underscores an often overlooked and foundational issue: how much control people should have over the technologies they own. The importance of this issue stems from two sources: the ubiquity of technological devices in our lives and the sheer bulk of personal information we entrust to them.
Law Enforcement vs. Strong Encryption
For nearly thirty years, U.S. law enforcement and national security agencies have criticized the use of strong encryption in computing and communications, claiming that cryptographic systems severely curtail legitimate law enforcement activity by making data unreadable to investigators. This argument has met with little success, due mainly to the lack of evidence that law enforcement investigations have been impeded as much as claimed, as well as a broad acknowledgment of the critical role encryption has come to play in our everyday security. Indeed, Julian Sanchez has recently explained the important role encryption has played in American society since the founding of the nation.
The distribution of child pornography online, however, has sent law enforcement agencies back to technology companies once again to find a way around the encryption conundrum. Specifically, strong cryptography renders data unreadable to anyone without access to the key, which can stymie police efforts to investigate the trafficking of CSAM across the internet and leave agencies unable to detect or disrupt CSAM distribution networks.
CSS gets around the encryption challenge in investigating CSAM by identifying targeted files through data scans on the user’s computing device before the user can encrypt them. By conducting these searches locally on the user’s computer at key moments in data processing when the information is still readable, rather than through a wiretap or device seizure, CSS allows law enforcement agencies to avoid the “going dark” problem posed by strong encryption. CSS thus appears to be a solution to the extended fight between law enforcement and technology companies over the use of encryption.
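The mechanism described above can be illustrated with a minimal sketch. Everything here is hypothetical: real CSS proposals (such as Apple's NeuralHash design) use perceptual hashes that match visually similar images and involve cryptographic protocols to hide the match list, whereas this toy version checks an exact SHA-256 fingerprint against an assumed local list. The point it shows is the one in the text: the scan runs on the user's device while the data is still plaintext, so the encrypted channel itself is never broken.

```python
import hashlib

# Hypothetical on-device list of fingerprints of targeted files.
# (Real systems use perceptual hashes, not exact SHA-256 matches;
# this is a deliberate simplification for illustration.)
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"example targeted content").hexdigest(),
}

def scan_before_encrypt(plaintext: bytes) -> bool:
    """Return True if the plaintext matches a known fingerprint.

    This check happens locally, at the moment the data is still
    readable -- i.e., before any encryption is applied.
    """
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_FINGERPRINTS

def send_message(plaintext: bytes) -> str:
    # Because the scan precedes encryption, the provider can flag
    # matches without ever decrypting the transmitted ciphertext.
    if scan_before_encrypt(plaintext):
        return "flagged"          # e.g., reported for review
    return "encrypted-and-sent"   # normal end-to-end delivery

print(send_message(b"hello"))                     # prints "encrypted-and-sent"
print(send_message(b"example targeted content"))  # prints "flagged"
```

Note that nothing in this sketch constrains what goes into the fingerprint list, a point the security critiques discussed below turn on.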
A recent article by several widely respected computer security experts has questioned the wisdom of deploying CSS, however. In addition to illustrating several ways CSS can fail, such as mistaking innocuous content for targeted material or being circumvented or hijacked by bad actors, the article’s authors point out that CSS weakens a key purpose of encryption—the desire of law-abiding users to avoid extralegal or unwanted surveillance. Others have pointed to the danger that CSS will be abused, with private data scanned by others even though the user has not given explicit permission. But beyond these valid points, there is an even larger issue raised by CSS, one that goes to the core of what it means to own a computing device today.
The User Control Debate
The nature of computers makes everything but the interfaces we use to interact with them opaque to most of us, and that is by design. There is quite a lot going on inside our devices that we would rather not have to worry about just to get our computers to do what we want. Those of us who are old enough to remember manually installing device drivers and editing configuration files probably do not relish a return to those days. But that does not necessarily mean we wish to relinquish control over our devices as part of this usability bargain. Like the strong encryption question, this debate about user control over the technologies we own has also been going on for decades, but often less visibly. In fact, technology users have been slowly losing this debate without necessarily knowing of its existence or what is at stake.
Until relatively recently, computers were general-purpose machines. That is, their owners could use them as they pleased, installing or removing components or software as it suited them, and controlling which processes could or could not run on the device. This landscape began to change when technology and entertainment industries created technologies to control user access to content, and strategies like digital rights management started to move certain parts of our devices out of the owner’s reach. This is something like buying a house where one room is permanently locked and only the builder has access. You can hear machinery operating in that room, but you have no way of knowing what those machines are doing, and there is no way for you to turn them off. Existing laws and policies have driven these changes, which have in turn quietly adjusted our customary ideas about ownership, at least with respect to digital devices and content.
These changes have taken place largely due to efforts to protect intellectual property, and efforts by law enforcement and national security agencies to influence similar changes have generally failed. But for technologies like CSS to work, portions of every computing device must also be walled off from user visibility and control. Search algorithms must be installed on every computing device to enable the scanning of data going in or out. If we allow this partitioning of device access and control for reasons of intellectual property protection, why not law enforcement or national security?
It is a fair question, and there are many who argue that we have already given up too much device control to the former. But CSS poses particularly troubling problems that go beyond existing objections. While CSS algorithms can be configured to scan for CSAM, they can also be configured to search for any other data others might be interested in knowing you have on your device, including private communications, location data, and personal documents. It does not require too active an imagination to foresee how such a system could be abused. And even though you “own” the device, there is nothing you can do about it.
One can understand the idea that technology ownership is an artifact of simpler days gone by, and that change is necessary to fairly balance the many interests at stake in our increasingly complex society. As technologies change and our uses of them evolve, we should frequently pause to evaluate their costs and benefits to society. Given the worthy goal of countering CSAM trafficking, we could conclude that using CSS on our devices is an appropriate solution. But before reaching that conclusion, we should fully understand what it means when we no longer control the technologies we think we own.
Our use of these devices has rapidly expanded to nearly every corner of our lives, and this has meant that their use is all but a requirement for participating in contemporary society. Because of their importance, we rarely think twice about giving these technologies unfettered access to our most sensitive data. But this requires a significant measure of trust that access to our devices is within our control—we decide who can or cannot see what information we put there. Maintaining that trust means ensuring users retain control over the devices they own.