The Declining Half-Life of Secrets

[Image: SCI cover sheet (1967)]

Image credit: US Government via Wikimedia Commons

The following post is a preview of a new paper from New America’s Cybersecurity Initiative, where the author is a fellow.

The nature of secrets is changing. The “half-life of secrets” is declining sharply for many intelligence activities: secrets that in the past might have been kept successfully for 25 years or more are now exposed far sooner.

For evidence, one need look no further than the 2015 breach at the Office of Personnel Management (OPM), which exposed personnel records for 22 million U.S. government employees and family members. OPM is just one instance in a long string of high-profile breaches in which hackers have gained access to personal information, trade secrets, or classified government material. The focus of the discussion needs to be on complementary trends in information technology, including the continuing effects of Moore’s Law, the sociology of the information technology community, and changed sources and methods for signals intelligence, all of which increase the likelihood that government secrets will not remain secret for long.

In an age where secrets become known sooner, “the front-page test” will become far more important to decision-makers. Even if a secret operation is initially successful, the expected costs of disclosure become higher as the average time to disclosure decreases.

The greater relevance of “the front-page test” has direct and important implications for governance of secret intelligence operations. For good security reasons, intelligence agencies have historically been insular, relying on heavily vetted employees with proven loyalty and discretion, working in secure classified facilities surrounded by physical and electronic barriers. This insularity, however, makes it harder for intelligence agencies to predict how diverse outside actors will view the revelation of a secret program. The declining half-life of secrets is thus an important factual reason to bring greater transparency and more perspectives into the governance of sensitive signals intelligence activities. As of June 2015, the Obama administration had already taken a series of measures, consistent with the Review Group’s recommendations, in that direction. These changes, however, were difficult to accept within the intelligence community; understanding the declining half-life of secrets will help the community better assess what is possible and optimal for the less-secret future.

During the Cold War, the United States developed the basic classification system that exists today. Under Executive Order 13526, an executive agency must declassify its documents after 25 years unless an exception applies, with stricter rules if documents stay classified for 50 years or longer. These time frames are significant, showing a basic mind-set of keeping secrets for a time measured in decades.

Nonetheless, three factors drive the decline in the half-life of secrets: the continuing effects of Moore’s Law (the observation that computing power doubles roughly every two years), the sociology of information technologists, and the different sources and methods for signals intelligence today compared with the Cold War. In an upcoming article, I will ask the reader to contemplate the implications if important secrets often get revealed in months or a few years.

The Internet makes it easy to disseminate leaks.

Daniel Ellsberg struggled to get the Pentagon Papers published. The lawyers for the New York Times initially recommended against publishing, although they eventually relented. Gatekeepers such as newspapers can be sued and are subject to persuasion by the government that a leak should not be published, as occurred when the Times itself delayed its story about warrantless wiretaps from before the 2004 election until December 2005.

The gatekeepers, however, have far less power today to shut the gates. WikiLeaks has shown that innumerable files can be posted to the Web without the assistance of the mainstream media. Reporters who received files from Snowden have in some instances decided not to print material due to concerns about harm to national security. But the ability of the government to rely on gatekeepers to prevent publication has declined sharply.

Other well-known trends of modern computing put secrets at further risk. The Internet of Things is based on a pervasive network of sensors. Big Data refers to the analytic ability to find patterns where none could previously be seen. Crowdsourcing means that far-flung individuals can coordinate their knowledge. Taken together, these trends can be applied to the activities of the intelligence agencies themselves. Spy satellites, for instance, can be tracked from the ground based on data from amateur astronomers around the globe. Drone strikes and the CIA’s extraordinary rendition flights of a decade ago have similarly been discovered.

Intelligence agencies have long employed the mosaic theory, where multiple small bits of information about a target are brought together to form an accurate picture. Now, the ability to form the accurate picture has been democratized: given the smallest clue, reporters and the general public can often reconstruct an agency’s activities.

The Sociological Challenge to NSA Secrecy

At the same time that advances in computing technology are dramatically reducing the power of gatekeepers to control the flow of information to the public, the NSA and other secret intelligence agencies are facing fundamental challenges in the sociology of those keeping their secrets.

A 2013 Foreign Policy article by science fiction writer Charles Stross emphasized the breakdown of lifetime employment, even at intelligence agencies. He argued that leaks would become more common as “nomadic contractor employees,” who have almost no loyalty to their employers, become willing to spill secrets. ACLU technologist Christopher Soghoian has shown that contractors spill secrets for a simpler reason: they often list their work experience, including even the names of classified programs, on LinkedIn as they search for the next job.

The likelihood of leaks by techies goes far beyond the shift from 30-year employees to contractors. There is a cultural and philosophical chasm between Silicon Valley and Washington, exemplified by the question of whether Snowden should be considered a traitor or a whistleblower. During my work on the Review Group, I spoke with numerous people in the intelligence community. Not a single one said that Snowden was a whistleblower. The level of anger toward him was palpable.

By contrast, a leader in a major Silicon Valley company said during the same period that more than 90 percent of employees there would say that Snowden was a whistleblower. The gap between zero and over 90 percent is a sociological chasm. It does not bode well for intelligence agencies that depend on cutting-edge information technologists.

The NSA and other secret agencies thus face a formidable problem: how to guard secrets when much of the information technology talent they rely on has anti-secrecy and libertarian inclinations. The sociology of information technology professionals poses a systematic threat to intelligence agency secrets.

Changing Sources and Methods for Signals Intelligence

Signals intelligence during the Cold War used sources and methods that, in retrospect, we can see were relatively unlikely to lead to leaks. Vanishingly few communications in the Warsaw Pact crossed over into Western telephone or other communications systems. Much of the signals intelligence was done passively, such as by listening posts around the edges of the Soviet Union. Where surveillance was more active, it often was done with relatively trustworthy partners, notably AT&T and the other national monopolies. In addition, the sociology of the Cold War was conducive to secrets. The individuals who cooperated with the NSA were highly unlikely to wish to aid the Communists by revealing sources and methods. Where Soviet spies existed, the leaks were to the other side, and not to the general public.

By contrast, three changes in the methodology and targets of signals intelligence today make leaks far more likely.

First, signals intelligence agencies no longer have the Cold War luxury of focusing on geographically separate communications systems. Potential terrorists, as well as communications users in war zones such as Iraq and Afghanistan, use the same mobile phones, laptops, and other consumer devices as citizens of the United States and EU member states. Civilians and intelligence targets alike use the same operating systems, encryption protocols, apps, and other software. Because of this, exploits developed for the battlefield or to spot terrorists work against civilian systems. This convergence of citizen and target communications gives an important new rationale to leak: the public has a right to know about programs that spy on ordinary citizens and political dissidents. Members of Congress have a similar desire to uncover the truth, as shown by the repeated questions to the administration about whether the PATRIOT Act Section 215 telephone metadata program gathered information on the calls of Senators and Representatives.

Second, the shift from passive listening posts to active intrusion is an additional reason why leaks are becoming more likely. Listening posts are generally outside of the area under surveillance, and the act of listening does not provide clues to the targets that they are being spied on. By contrast, intrusion carries with it the risk of detection. Intrusion detection is now a pervasive part of system security. Thus, penetration by an intelligence agency or others has to cope with sophisticated defensive measures that bring attacks to light. Once an intrusion is detected, system owners are getting much better at attribution.

Third, the nature of those holding communications has similarly changed. The Cold War was fought in an era of monopoly communications companies such as AT&T and government-run PTTs (post, telephone, and telegraph) in other countries. These large companies had longstanding relationships with the government. They employed cadres of individuals with security clearances, and often government experience, to carry out law enforcement and national security wiretaps.

Today, communications are spread across a large and growing number of companies that lack the same structures for trusted sharing with the government. When Facebook paid $1 billion for Instagram in 2012, the latter company had 30 million users but only 13 employees. In a world where competitive new social network and other communications providers arise constantly, many communications of interest to a signals intelligence agency take place within companies with no track record of keeping intelligence agency secrets.

In an age where it is all but inevitable that many important secrets will come to light, governments must plan for the possibility of disclosure. In practice this means a more systematic use of “the front-page test” for activities of intelligence agencies.

In an era when secrets may become public in the near term, intelligence agencies such as the NSA will need to communicate and engage differently with other policy makers and the general public. Agencies have long been reluctant to provide transparency for fear of follow-up questions and assistance to those applying the mosaic theory to the agency’s activities. But the failure to explain, to say it in ways that are persuasive on the front page, has greater costs now. The world responds to what it learns about the current activities of intelligence agencies. Ignoring those responses will be bad for the intelligence agencies and for the many goals of our nation and its allies.


About the Author(s)

Peter Swire

Huang Professor of Law and Ethics at the Georgia Tech Scheller College of Business, Former Chief Counselor for Privacy in the U.S. Office of Management and Budget (1999-2001), Former Special Assistant to the President for Economic Policy in the National Economic Council (2009-10), Member of President Obama’s Review Group on Intelligence and Communications Technology (2013)