Encryption, Anonymity, and the “Right to Science”

“The right to science is sometimes considered a prerequisite for the realization of a number of other human rights.” – Farida Shaheed, UN Special Rapporteur in the field of cultural rights

While much of the recent debate around digital rights has focused on rights to freedom of expression and privacy, the debate’s next phase requires an assessment of the impact that government efforts to compromise security technology have on economic, social, and cultural rights. The resurgence of anti-encryption, anti-anonymity rhetoric and policy calls for defenders of digital freedoms to engage in even broader thinking about digital rights in order to prevent a potentially disastrous compromise of individuals’ digital security.

It’s no secret that the current outlook for robust security measures online is grim (as reflected in the civil society submissions to David Kaye, the UN Special Rapporteur on freedom of opinion and expression, for his report on encryption and digital anonymity). Law enforcement officials trumpet the threat of “going dark” as part of the US and UK governments’ argument that encryption and anonymity could stymie law enforcement and intelligence agency investigations into terrorists and criminals. They claim the answer to this threat lies in maintaining governments’ ability to access intelligible digital communications data and content, including through cooperation from telecommunications companies. Numerous other states around the globe are exploring similarly restrictive practices.

Though we may not immediately recognize it, economic, social, and cultural rights are fundamental to the contest surrounding the role of encryption and anonymity in the future of the Internet. That’s because these rights are inseparably tied to the “right to science,” which includes access to the benefits of the latest advances in digital security.

Article 27 of the Universal Declaration of Human Rights (UDHR) and Article 15 of the International Covenant on Economic, Social and Cultural Rights (ICESCR) both lay out the “right to science” as an important element of economic, social, and cultural rights. The UDHR states that “[e]veryone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits.” The ICESCR recognizes “the right of everyone … [t]o enjoy the benefits of scientific progress and its applications.”

In the words of Shaheed: “Given the enormous impact that scientific advances and technologies have on the daily lives of individuals … [the right to science] must be read in conjunction with freedom of expression, including the freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers.”

Encryption and anonymity tools derive from scientific process and progress, and their use to defend free expression and privacy merits protection under the ICESCR. Encryption is fundamentally based on mathematics, while anonymity software such as Tor is designed as “a circuit of encrypted connections through relays on the network.”
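The mathematical foundation of encryption can be made concrete with a small sketch. The following Python snippet walks through a toy Diffie–Hellman key exchange — the kind of number-theoretic construction underlying many encrypted connections, including those Tor relays use. The parameters here are illustrative and deliberately tiny; real systems use primes thousands of bits long.

```python
# Toy Diffie-Hellman key exchange: two parties derive the same shared
# secret using only public values plus their own private exponents.
# Parameters are illustrative only -- far too small to be secure.

p = 23   # public prime modulus (real deployments use 2048+ bit primes)
g = 5    # public generator

a = 6    # Alice's private key (never transmitted)
b = 15   # Bob's private key (never transmitted)

A = pow(g, a, p)   # Alice's public value: g^a mod p
B = pow(g, b, p)   # Bob's public value: g^b mod p

# Each side combines the other's public value with its own private key.
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob = pow(A, b, p)     # (g^a)^b mod p

# Both computations yield the same secret, which an eavesdropper who
# sees only p, g, A, and B cannot feasibly recover at real key sizes.
assert shared_alice == shared_bob
print(shared_alice)
```

The security of the real protocol rests on the difficulty of the discrete logarithm problem — pure mathematics, which is why such tools are properly understood as applications of scientific progress.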

To once again quote Shaheed:

Science must be understood as knowledge that is testable and refutable, in all fields of inquiry, including social sciences, and encompassing all research. The terms “benefits” of science and “scientific progress” convey the idea of a positive impact on the well-being of people and the realization of their human rights. The “benefits” of science encompass not only scientific results and outcomes but also the scientific process, its methodologies and tools.

The development of encryption and anonymity tools relies on scientific knowledge and processes and draws from the formal sciences of mathematics and computer science. These tools positively impact the digital security of the vast majority of the public — protecting against the theft of consumers’ financial data as well as defending against politically motivated digital attacks. They are thus properly considered benefits of scientific progress to which the public has a right.

The need for encryption and anonymity proves all too real time and again, as government surveillance efforts pit state concepts of national security against individual, human security. Both mass surveillance and government-linked targeted digital attacks against civil society exploit unencrypted network traffic and data at rest.

Digital attack techniques such as network injection — a capability that is offered for sale by Western spyware companies, and developed in-house by powerful governments like the US, UK, and China — rely on unencrypted transmission of data, and have dramatically compromised freedom of expression and privacy. In the face of these outsized threats, individuals and civil society organizations require real options to protect themselves in the digital space.

Two additional aspects of the application of economic, social, and cultural rights are particularly relevant to discussion of digital rights.

First is the matter of scientific research. Article 15(3) of the ICESCR provides that “States Parties to the present Covenant undertake to respect the freedom indispensable for scientific research and creative activity.” The security research community that develops protections against the digital threats we face requires freedom to do its work. That work includes not only development of encryption and anonymity software, but also investigation of computer exploits. Legal, regulatory, or policy measures enacted to address digital threats, including export regulations, must protect such research activity pursuant to Article 15(3).

Second, and conversely, the development and application of certain technologies that serve the purpose of violating human rights may also implicate the right to science. Greater oversight and regulation of such activity, particularly in the commercial spyware market, is essential. The UN Committee on Economic, Social and Cultural Rights has indicated:

States parties should prevent the use of scientific and technical progress for purposes contrary to human rights and dignity, including the rights to life, health and privacy, e.g. by excluding inventions from patentability whenever their commercialization would jeopardize the full realization of these rights.

And Article 27(2) of the World Trade Organization Agreement on Trade-Related Aspects of Intellectual Property Rights says:

Members may exclude from patentability inventions, the prevention within their territory of the commercial exploitation of which is necessary to protect ordre public or morality, including to protect human, animal or plant life or health or to avoid serious prejudice to the environment, provided that such exclusion is not made merely because the exploitation is prohibited by their law.

Yet corporate and government developers of technologies that raise significant human rights concerns have sought to patent their inventions. For example, Hacking Team filed a patent application for its network injection tool, while a Chinese government entity called the National Computer Network and Information Security Management Center has filed numerous applications for technologies that appear designed for surveillance of public networks (see here and here). Authorities responsible for patent issuance should consider establishment of independent review processes to further interrogate designs that incorporate elements enabling electronic surveillance. Authorities should decline to issue patents for inventions that, according to independent assessment, have as their predominant purpose and probable use the compromise of digital security in violation of internationally recognized human rights.

While it is increasingly accepted that the same rights apply online as offline, the technical layer that exists between human beings and the exercise of our rights online merits further consideration and recognition. The technical underpinnings of individuals’ digital experience raise a host of unique concerns: Do we have the right to secure code? (And how do we define “secure”?) Do we have the right to research, develop, and use robust security protocols in our digital communications? How do governments attempt to control or undermine such protocols? What barriers to entry exist in public access to secure software and hardware, such as technical expertise or product cost? In the context of development, are we appropriately addressing the right to access not just technology, but secure technology?

The encryption and anonymity debate requires us to dig deeper into aspects of human rights that are unique to the digital space — to address our “right to science,” and the freedoms that right enables.

About the Author(s)

Sarah McKune

Senior Researcher for the Citizen Lab at the Munk School of Global Affairs, University of Toronto