The world’s foremost software engineers and cryptographers are gearing up to take on the National Security Agency (NSA), hoping to develop and implement technological solutions that could thwart the agency’s bulk surveillance methods.  This is the message from last week’s RSA conference and this week’s Internet Engineering Task Force meeting, two prominent events in the tech community.  But if you speak with engineers involved in this effort, you will find that some are actually pessimistic about their own prospects for success.  They will note that the NSA’s roughly $10 billion budget dwarfs their own, that they will always be playing catch-up against an agency with secret capabilities, and that the NSA can throw large amounts of resources at new technological challenges.  This is what I call the myth of NSA omnipotence.

Members of the national security community (although perhaps not those in the NSA itself) also believe the myth.  Their perceptions are shaped by the decade after 9/11, in which the NSA seemed capable of solving even the most intractable collection challenges.  This is why many discussions about the fallout from the Snowden revelations focus on possible negative responses from Congress, which might pass overly restrictive legislation, and from adversaries, who might modify their operational security.  Few national security wonks question NSA’s ability to respond to technological advancements that emerge in response to these disclosures.

Those who believe in the myth should take a closer look at accounts of NSA surveillance in the years immediately preceding 9/11.  These paint a remarkably different portrait of the agency than the one we have been provided by Snowden.  NSA as of early 2000 had “fallen behind and was in danger of irrelevance,” and was “digging out of a deep hole.”  The Senate Select Committee on Intelligence (SSCI) as of early 2000 was “increasingly troubled by the National Security Agency’s (NSA) growing inability to meet technology challenges and to provide America’s leaders with vital signals intelligence.”  The 2000 National Commission on Terrorism found that NSA was “losing its capability to target and exploit the modern communications systems used by terrorists.”

This situation came to a head on January 24th, 2000, when the computer system at the NSA’s headquarters crashed, halting its surveillance efforts.  Essentially, America’s entire signals intelligence collection went dead for a period of three and a half days, until the system came back online on January 28th.  The Los Angeles Times characterized the event this way: “[T]he blackout of the world’s most powerful collection of supercomputers is hard evidence of the vast problems facing America’s largest and most secretive intelligence agency.  By all accounts, the NSA has lost its lead—and perhaps its way—in the information revolution it helped create.”

While some of this might be hyperbole, it seems clear from these characterizations, and from the capabilities we have learned about recently, that NSA made a remarkable turnaround.  By all accounts, NSA has regained its lead.  How did NSA make this short trip from “irrelevance” to “omnipotence”?  One word: resources.  The national intelligence budget grew by roughly 130% between 2001 and 2012.  Although the actual figures are classified, it is a safe assumption that NSA’s budget followed roughly the same trend line.  What this means is that NSA could easily find new resources to dedicate to new threats or new technological challenges.  This stands in contrast to the situation in the early 1990s, when the NSA budget and workforce were cut by roughly one third.

In an environment in which information technology evolves quickly, what matters more than NSA’s topline budget figure is the trajectory of that budget.  With a flat or declining budget, NSA must more clearly set priorities, divert resources from one program to another, and clear the bureaucratic and legislative hurdles that such shifts require.  What makes this all the more difficult is that it is often not clear exactly where the technology is going, which forces NSA to make risky bets and invest resources in areas that might not ultimately pay off.  That is what the agency was forced to do, with marginal success, during post-Cold War budget cuts.

The intelligence community, and NSA in particular, have historically been very bad at aligning resources with new intelligence priorities.  According to the same 2000 SSCI report, “as resources have been reduced, the NSA systematically has sacrificed infrastructure modernization in order to meet day-to-day intelligence requirements.”  Former NSA Deputy Director for Operations Rich Taylor has stated, “There were budget cuts starting in 1991, and the people who should have been proactive then in thinking about how NSA was going to handle budget cuts relative to also making changes [in response to new technology challenges] didn’t act quickly enough and therefore had to react when there were fewer resources to fight over.”  Former NSA Director Michael Hayden, in testimony before the congressional joint inquiry into the 9/11 attacks, said that when he did try to divert resources to new challenges, he faced opposition from within the Executive Branch and from Congress.

Today, the trajectory of NSA’s budget looks more similar to the 1990s than to the 2000s.  That budget will almost certainly remain flat (in the best case scenario) or decline over the next decade, forcing the agency to think harder about its priorities and resources.  And there is some reason to believe NSA will respond no better today to budget constraints than it did in the 1990s.  The Director of National Intelligence was created to help with this very situation and to ensure that intelligence community resources can be effectively tasked against new challenges.  But many question whether he has sufficient authority or will exercise the authority that he does have.  Further, it took roughly ten years, from late 2001 to 2011, for NSA to jettison its bulk Internet metadata collection program, which IC officials now acknowledge had little operational value.  NSA officials are also fighting to maintain their bulk phone records collection program, which multiple review groups have examined and found to be of limited utility.  These examples do not bode well for NSA’s ability to respond adroitly to current budgetary realities.

Returning to the subject of today’s anti-NSA engineering efforts in the tech community, the brunt of this initiative will be directed at encouraging the development and uniform adoption of encryption and security standards that would make it difficult for NSA to collect communications in bulk (what the Internet Engineering Task Force calls pervasive passive monitoring).  This would notionally force the agency to develop surveillance methods that require much more sophisticated engineering in order to engage in bulk collection, while leaving openings for more targeted collection against specific individuals.  Although I think bulk collection does have its uses in select circumstances, I’m generally in favor of this goal because it can actually make for more effective intelligence collection, mitigating the signal-to-noise problem that sometimes hampers good intelligence work.  The efforts of the tech community will both pose a challenge to NSA and represent an opportunity to develop a smarter approach.  This leads me to the following two conclusions:

First, those engineers prepared to build defenses against bulk collection should not be deterred by the myth of NSA omnipotence.  That myth is an artifact of the post-9/11 era and may now be outdated in the age of austerity, when NSA will struggle to find the resources to meet technological challenges.  Second, intelligence policymakers should be aware of the dangerous interplay between today’s budgetary environment and prospective technological developments that, if successful, may decrease the value of bulk surveillance.  They should be vigilant in their efforts to ensure that more targeted surveillance approaches are fully funded and that NSA shifts scarce resources away from bulk collection platforms that may be of decreasing value.