On Feb. 28, U.S. President Joe Biden issued an Executive Order on Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern. The order directs federal agencies to prohibit or restrict the transfer of Americans’ bulk sensitive personal data to those countries, warning that foreign adversaries “can rely on advanced technologies, including artificial intelligence (AI), to analyze and manipulate bulk sensitive personal data to engage in espionage, influence, kinetic, or cyber operations or to identify other potential strategic advantages over the United States.”

Some critics were quick to point out that the U.S. government itself is in the business of acquiring vast amounts of commercially available information (CAI) from data brokers, which it regards as a subset of publicly available information (PAI). “For most Americans, the country of greatest concern on surveillance is the US,” Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project, told the Associated Press. “Americans are tracked every day by an increasingly invasive array of private data brokers and government agencies, transforming nearly every aspect of our digital lives into marketing and policing tools.”

The appetite of federal law enforcement and intelligence agencies for CAI has increased dramatically in recent years. Calls for reform are growing, and lawmakers in both parties appear to agree that new legislation is needed to limit the government’s use of CAI. The Fourth Amendment Is Not For Sale Act, a bill that would require authorities to seek a warrant before acquiring certain types of CAI, has bipartisan support: it advanced out of the House Judiciary Committee last summer, and a companion bill was re-introduced in the Senate last year.

While criticism directed at the federal government is appropriate and statutory reform is clearly necessary, there also appears to be a growing recognition within the White House and the intelligence community that the way federal law enforcement and intelligence agencies use CAI is in need of reform.

The Director of National Intelligence Sheds Light on the Subject

Speaking at the 10-year anniversary event hosted by Just Security at New York University Law School on Feb. 29, Director of National Intelligence Avril Haines addressed the challenging questions the government faces over the use of such information and pointed to an effort by her office to establish clearer rules about the use of CAI:

Today, not only is an astounding amount of commercial information available to the public, but various actors, including adversaries, also have access to increasingly advanced analytic tools that they rely on. And among other things, artificial intelligence to exploit such information in new ways that exacerbate existing threats such as cybersecurity, and at the same time as more of our daily lives are connected to the digital world, including through the internet of things, the combination of an increasing amount of readily available data regarding the activities of individuals, often perceived as not especially sensitive on its own, alongside increasingly sophisticated analytic tools can in aggregate raise significant privacy and civil liberties issues.

It is refreshing to see this turn toward understanding the profound privacy and civil liberties concerns raised by the transformative power of AI-driven analytic tools, which can distill insights about people from collections of CAI that, taken piece by piece, would not necessarily raise such concerns.

Haines also mentioned a report prepared for the Office of the Director of National Intelligence (ODNI) that was released in a redacted form last year:

We in the IC [intelligence community] recognize this fact, asked a number of external experts to make recommendations regarding how and under what circumstances we should use commercially available information and in particular, to reflect on the existing framework for ensuring the protection of privacy and civil liberties. We made the report public and are following through on their recommendations. And we’ll continue to make public as much as we can on this issue, because such transparency is a foundational element of securing the public trust in our endeavors alongside the protection of civil liberties and privacy.

The report Haines referred to says it was prepared to “(1) describe the role of CAI in intelligence collection and analysis; (2) reflect on the existing framework for ensuring the protection of privacy and civil liberties; and (3) make recommendations to the IC regarding how and under what circumstances an IC element should collect, use, retain, and disseminate CAI.”

The U.S. government has long distinguished between private information, PAI, and CAI. Operationally, many agencies have effectively treated CAI as a form of PAI, regarding it as permissible to acquire and use. Indeed, the ODNI report acknowledges that “CAI (because it is also PAI) is less strictly regulated than other forms of information acquired by the IC.”

But the ODNI report suggests this understanding may no longer be appropriate. “In our view, however, profound changes in the scope and sensitivity of CAI have overtaken traditional understandings, at least as a matter of policy.”

Critics of the acquisition and use of CAI by the government – and in particular by law enforcement and intelligence agencies – regard it as a violation of privacy rights, since CAI may contain sensitive data that could reasonably be deemed private, or may be combined or analyzed in ways that reveal additional information. Should the government formally adopt the view that CAI can no longer be treated as ordinary PAI, it would have substantial implications for the use of CAI in law enforcement and intelligence work.

And, it would bring the government closer to the views of some privacy experts, who have argued that commonly understood distinctions between what is public and what is private are too simplistic, particularly in the age of big data, machine learning, artificial intelligence, and mass surveillance.

For instance, the philosopher Helen Nissenbaum has long held that “theories of privacy should also recognize the systematic relationship between privacy and information that is neither intimate nor sensitive and is drawn from public spheres.” And in a recent academic paper, “Data Is What Data Does: Regulating Based on Harm and Risk Instead of Sensitive Data,” law scholar Daniel Solove underscores that “in the age of Big Data, powerful machine learning algorithms facilitate inferences about sensitive data from nonsensitive data. As a result, nearly all personal data can be sensitive, and thus the sensitive data categories can swallow up everything.”

A New Understanding May Be Difficult to Propagate to the Agency Level

If old notions around the use of PAI and CAI are replaced by a new conception of the risks and potential harms of the acquisition of personal information from public and commercial sources, how quickly might it propagate across law enforcement and intelligence agencies?

For one, it might mean that the agencies should actually know how much of this information they have already acquired, from whom they have purchased it, and how it is handled. The ODNI report acknowledges that determining these matters would be a “complex undertaking requiring attention to procurement contracts, functionally equivalent data acquisition processes, data flows, and data use.” The report suggests the executive branch’s use of such information currently lacks that kind of internal governance: “The IC cannot understand and improve how it deals with CAI unless and until it knows what it is doing with CAI.”

And, a new understanding of the privacy implications would mean that individual agencies would have to develop or adhere to clear policies around the acquisition and use of CAI. But there are signs that this is a conversation some officials do not want to have, at least not publicly.

At a House Judiciary hearing last July, Federal Bureau of Investigation (FBI) Director Christopher Wray was questioned about the results of the ODNI report and the agency’s use of CAI. Rep. Pramila Jayapal (D-WA) asked a number of specific questions, including:

  • “Does the FBI have a written policy outlining how it can purchase and use commercially available information?”
  • “When was that policy last updated?”
  • “What about a written policy governing how commercially available information can be used in criminal investigations?”
  • “Does the FBI have a written policy interpreting the Supreme Court’s decision in Carpenter?”

Director Wray declined to answer these and similar queries from Rep. Zoe Lofgren (D-CA) directly. Instead, he stated that questions concerning the FBI’s use of commercially available data were too complex to be addressed in the context of a hearing because of technical and security considerations, and that he would prefer to address them in a closed briefing. That response was nearly identical to one he gave Senator Ron Wyden (D-OR) when asked similar questions at a Senate Intelligence Committee oversight hearing a year ago.

Perhaps indicative of how sensitive these questions are for the FBI, Director Wray did not move quickly to deliver a closed briefing. At the markup hearing in which the Fourth Amendment Is Not For Sale Act was reported out of the House Judiciary Committee last year, Rep. Lofgren noted the lack of follow-up. “The Director said that it was complicated, which I didn’t think it was, and that they would reach out to us,” said Rep. Lofgren. “And we’re still waiting for the outreach from the FBI. Just thought I would report that they never did follow up with the explanation of what they purported to claim was complicated.” As of this week, Rep. Lofgren’s office says it has still received no further information or briefing from the FBI on these questions.

The ODNI report concluded by suggesting that a “traditional working group” of intelligence community senior officials should convene to carry forward its recommendations by developing more substantive principles, procedures, and processes regarding CAI. At a future oversight hearing, perhaps Director Wray and other intelligence community officials could be pressed on whether such a working group was ever formed. The answer might indicate whether an apparent new consensus on the nature of CAI is emerging and will ultimately be applied to the U.S. government’s own use of it. In some respects, CAI in the hands of one’s own government is just as threatening to privacy and civil liberties interests as it is in the hands of a foreign entity, if not more so.

IMAGE: Visualization of data (via Getty Images)