Co-published with Tech Policy Press.

On Wednesday, United States Magistrate Judge Zia M. Faruqui issued an order in the highly important case of The Republic of Gambia v. Facebook, Inc. The Gambia seeks content from Facebook that relates to the genocide in Myanmar in order to assess “responsibility for genocide” against the Rohingya before the International Court of Justice (ICJ). Facebook’s role in propagating hate speech against the Rohingya and facilitating the genocide is well documented.

As the judge points out in his opinion, Facebook argued that The Gambia’s request “(1) violates the Stored Communications Act (SCA), and (2) is unduly burdensome.” But Faruqui ruled in favor of The Gambia. While acknowledging that Facebook did the right thing in deleting content targeting the Rohingya, he criticized the company for failing to provide the evidence needed to hold those responsible for the genocide to account. “Failing to do so here would compound the tragedy that has befallen the Rohingya,” he wrote.

The decision has important implications for what access investigators will have to content deleted from social media platforms going forward. For a better understanding, we posed a set of questions to Dr. Alexa Koenig, Executive Director of the Human Rights Center, a lecturer at UC Berkeley School of Law, and founder of the Human Rights Investigations Lab, asking her to discuss the judge’s decision and its broader consequences. 

1. How might this ruling potentially empower international tribunals to access certain kinds of evidence?

This ruling potentially empowers international tribunals to obtain information that has been removed from social media sites for violation of terms of service but that social media companies can still access internally. The judge made a number of interesting findings: the most significant is his determination that content that has been taken down for violation of Facebook’s terms of service falls outside the ambit of the Stored Communications Act (SCA). This was the basis for the court’s ordering Facebook to share the requested content with The Gambia, including posts from Myanmar officials and Facebook’s internal investigation documents (pp. 28-29 of the opinion) related to those posts. 

The SCA bars social media companies from knowingly sharing their users’ content with others, with only a few exceptions. It does so for several important reasons, namely to protect users’ privacy and to address “fourth amendment-like” concerns (order at p. 9), including the prevention of unreasonable searches and seizures. In this case, the court determined that the SCA does not stand in the way of Facebook disclosing the requested data.

In addition, the court’s ruling underscores the ability of foreign governments to use a federal statute (28 U.S.C. § 1782) as a mechanism to request help from U.S. courts in obtaining social media content that has been removed for terms of service violations and is no longer publicly available online. Typically, a foreign government like The Gambia would make a Mutual Legal Assistance Treaty (MLAT) request for information, but that process has been repeatedly criticized as unwieldy and slow. Another mechanism is for foreign governments to use the CLOUD Act to request information. The CLOUD Act provides a more streamlined process than the traditional MLAT process, but it requires the United States to have first entered into a bilateral agreement with the foreign country after assessing that the country has sufficient due process protections in place to support a fast-tracked process. Most countries do not yet have such an arrangement with the United States, so this isn’t an option for most foreign government requests. Section 1782 is yet another mechanism foreign governments can use: they can ask U.S. courts for help in compelling the production of data. Courts have significant discretion over whether to issue an order based on a § 1782 request. That was the mechanism used here.

2. The decision applies to content that was previously public, including content posted in self-declared “private” groups with large numbers of members. What do you think of where the court drew the line here?

The court in The Gambia decision sweeps information that many might consider “private” into the “public” exception to the SCA. The court spends quite a bit of time establishing that private Facebook groups might be considered public in situations where, for example, administrators automatically grant access to a private group such that it functions like a public page. It also emphasizes that the Myanmar officials whose posts are at issue “clearly” intended their posts to reach the public, given their desire to spread hate, which–to be widely effectuated–depends on having a relatively broad reach.

I think the court was trying to find a way to ensure that critical information that could be helpful for evidentiary purposes could be made available. The fact that this is a case involving allegations of a particularly horrific crime like genocide–with widespread and shocking examples of the ways in which social media was used to incite physical violence against the Rohingya–likely played a role in its analysis. The court seems to recognize how critical the content could be for helping The Gambia better understand who may have been involved in inciting genocide, how the process played out, and the relative impact the posts might have had with regard to what transpired on the ground in Myanmar.

While it makes sense that purportedly private posts that reach millions of people may be functionally public and thus operate within a grey zone–and that people who are trying to disseminate dangerous hateful content shouldn’t be able to hide behind a private account that has broader reach than most public ones–this is a potentially slippery slope that could become dangerous if not carefully guarded. 

Ultimately, the court attempts to narrow the applicability of its finding (and any future damage) by stating that “it is the rare case here that the authors nakedly displayed their intent to reach the public and such intent was independently confirmed”–emphasizing that this is a relatively circumscribed set of circumstances in which private groups might be considered functionally public. 

3. Does this decision disincentivize Facebook from permanently deplatforming bad actors? If Facebook permanently deplatforms an account, the decision applies; if it does not, the decision does not necessarily apply.

There are a lot of moves that Facebook could make in response to this, including not permanently deplatforming bad actors, but also not holding onto the information that’s taken down as part of that deplatforming process. However, allowing bad actors to continue to act out on its site is a potentially onerous and dangerous path for Facebook, as it would have to repeatedly and continually assess the behavior of deeply problematic accounts — not to mention the outrage if the company allows dangerous communications and other behavior to continue to proliferate on its sites even after being flagged internally or by external watchdogs.

However, this decision also creates a potential incentive for Facebook to expand its participation in ongoing conversations about how to handle social media content that is at risk of removal but has significant potential evidentiary value. One option, should Facebook not want to hold onto the content itself, is to share that information with an external repository, such as a digital evidence locker designed to hold social media content relevant to international crimes like genocide, war crimes, and crimes against humanity. The creation of such an evidence locker is something that several human rights organizations and academics have called for, including Human Rights Watch, the Human Rights Center at UC Berkeley, and affiliates of Harvard, among others. Of course, any system that is created will need significant due process protections to safeguard and balance the competing human rights and civil rights interests at stake. But we’ve seen several potential precedents for such a system, ranging from the evidence collection mechanisms created to aggregate evidence of atrocities in Syria and Myanmar, including social media evidence; to hash repositories for terrorism-related content; to information sharing mechanisms for European law enforcement.

4. What unanticipated negative consequences could emerge from this ruling for human rights activists and others? For example, the decision places emphasis on the fact that these were inauthentic accounts. What might it mean for anonymous accounts in repressive countries?

The opinion is relatively narrow in ways that limit the risk of significant downstream harms with respect to this set of concerns. However, all decisions raise the potential for unanticipated negative consequences–and there’s a long line of examples in which a case that appears to support civil or human rights eventually becomes a problem for the very people it originally seemed to help. One significant concern, of course, is that governments will try to use this decision to secure information about human rights activists and other disfavored individuals and groups by arguing that their accounts are inauthentic or that they are part of a network engaged in coordinated inauthentic behavior. We’ve seen many instances of human rights activists posting videos, photos, and text documenting violent or graphic incidents to social media to bring global attention to those atrocities, only to have that content removed and their accounts shut down–sometimes temporarily, but sometimes permanently. These are individuals and networks that are already vulnerable, and they might become more so if governments successfully use this court order — or the reasoning of the judge’s opinion — to increase their access to removed content.

Of course, other unintended negative consequences could include Facebook and other social media companies becoming even less aggressive about removing harmful content, as already discussed above, or being less clear about whether the deplatforming of particular accounts is permanent. This is likely to be a focus of future litigation in situations where the permanence of the removal is ambiguous.

5. What do you make of Facebook’s arguments?

I think Facebook is right to argue in support of privacy and to defend the privacy interests of its users, and its lawyers raise an important policy consideration with their argument that this holding could make a large number of deactivated accounts subject to disclosure, along with content removed by other social media companies. However, legitimate legal processes, with significant due process protections, can minimize the potential harm to privacy. In addition, the court stresses that this fact pattern concerns coordinated, inauthentic behavior–not “genuine communications from real users”–suggesting that the risk is far smaller than Facebook claims.

Ultimately, none of the values implicated by removal processes–including privacy–can be considered in a vacuum. From a rights perspective, this case also implicates freedom of expression, access to information, and accountability interests. It’s important for courts to try to strike a balance among those competing human rights interests, or at least determine in which direction the balance of justice should tilt based on the relevant law and the specific facts of a case. In this decision, the court is struggling to apply a highly outdated law to a contemporary fact pattern; in many ways, the tool it is forced to apply isn’t fit for purpose. I think this case puts additional pressure on Congress to update the SCA and bring it more squarely into the 21st century.

As for Facebook’s argument that the removed information is a backup and thus subject to SCA protections because it is stored adjacent to active content: that seems like a stretch. In addition, the court raises an important point about the legislative intent underlying passage of the SCA. As stated in the order, “the SCA was created to allow platforms to flourish for users not to protect records for a provider” (opinion at p. 17). It’s unlikely that Facebook is suddenly going to struggle to keep users or gain new ones as a result of this decision, as the judge notes as well.

Facebook also argued that providing the requested information would be “unduly burdensome”–the standard courts apply in deciding whether to exercise their discretion to allow discovery under § 1782–and that The Gambia’s request is overly broad. However, the court rejected these arguments, pointing out that The Gambia asked for a specific set of records, namely those that Facebook had already identified and deleted from its platform for terms of service violations involving hate speech related to the genocide in Myanmar. Given the scale at which Facebook operates, and the many variables that narrow the request (removed content, content removed for a specific reason, within a given date range, and specific to a particular geographic region), it seems Facebook could make a good faith attempt to comply with the order.

Finally, with regard to Facebook’s argument that the SCA makes the disclosure of exempted content discretionary to Facebook since the SCA states that when there is an exception, the provider “may” share content: that’s a difficult argument to sustain. As expressed in the judge’s opinion, it’s more logical to read the word “may” as granting permission to share content when an exception applies, not discretion to refuse a lawful request. As stated in the order, “the Court cannot logically conclude that Congress gave Facebook greater power over discovery than the judiciary” (order at p. 21), which would be the case if Facebook had the authority to decide whether or not to turn the content over.

6. The opinion turns on the fact that the content was not in “backup storage” because the original version was now permanently deleted (hence no longer a “backup”). Do those parts of the opinion introduce any significant change into the SCA’s protections for other (private) communications in cases where there is only one copy, or where a service provider has deleted the messages but retains one copy?

This will be interesting to watch play out. Of course, any case is specific to its own facts. Thus, the question of what is and is not considered permanent removal of content versus backup storage in other contexts will likely get attention in future cases.

As you noted earlier, the court emphasizes that “much of the content The Gambia seeks was posted publicly” before it was removed. This suggests that there is also future work to be done to hammer out the nature of content that falls into a grey zone between public and private information (see fn. 14). Where exactly information falls along that spectrum will ultimately dictate which exceptions apply to the SCA’s ban on sharing content. 

7. Could this decision incentivize companies not to hold permanent records or else to use novel encryption technologies or other mechanisms to avoid ever possessing or retaining records of communications?

Absolutely. That’s probably why the court is so laudatory of Facebook’s efforts to store and analyze the content it removed for the potential role that content played in furthering atrocities in Myanmar–the opinion recognizes the potential disincentives this decision may create for doing the “responsible thing” before or after violence breaks out, including removing inflammatory content and the accounts of those who foment violence, and analyzing how activities on the platform may have triggered or exacerbated that violence. It’s critical that social media companies also analyze the role of algorithmic prioritization and deprioritization of content, and that they are engaged in preventing and accounting for human rights violations and international crimes like genocide that are facilitated by their platforms, in order to comply with corporate social responsibility norms. No one will ever have as much relevant data as the companies themselves, and therefore it is in the public’s interest that the companies play a major role in detecting and responding to how their platforms are being used to commit atrocities.

8. If you were an investigator on the January 6 committee, what opportunities would this ruling open in terms of obtaining information that one might otherwise think is protected by the Stored Communications Act? 

There are a lot of ways to limit the applicability of this case to the January 6 inquiry based on both the underlying facts and the law. First, this decision centers on a request by a foreign government, not a domestic entity. Second, investigations by members of our executive branch have a different purpose than, and are subject to different rules than, investigative processes triggered by Congress, as Elizabeth Goitein spells out in her recent Just Security article on preservation requests made by Congress to telecommunications and media companies related to the events of January 6. According to her analysis, there are quite a few grey areas with regard to the scope of congressional authority to access the records that are the target of its preservation request, including the scope of the SCA’s applicability to such requests. Much of her analysis concerns congressional authority to obtain private communications content (rather than logs and other records that do not involve content). The court’s reasoning in The Gambia case could be persuasive with regard to how to define the boundaries between private and public communications and whether the SCA would protect information in permanently deleted accounts.

In addition, as discussed earlier, the court swept quite a bit of information that some might consider “private” into the “public” exception to the SCA. However, a significant swath of the content at issue in the January 6 context–such as instant messages–would likely fall outside the consent exception. Even content shared in private groups may be regarded quite differently if it was clearly intended to be private–because strict admission rules were in place, for example, or because posts were seen by a relatively small percentage of Facebook’s users.

Further, what is being asked for in each situation is quite different. There is an important distinction between authority to make a preservation request and the authority to obtain the information. The human and civil rights interests implicated by requests to preserve content for potential later acquisition (as in the January 6 context) are less extreme than those implicated by requests to access that content (as with The Gambia). One of the arguments we’ve made in the international context is that there should potentially be a wider array of actors empowered to make preservation requests of social media companies–given the acute risk of removal if the content is particularly graphic (the very content that might have the greatest value for later accountability), government actors’ relatively frequent complicity in human rights violations and international crimes, and the sometimes quite lengthy delays before international legal investigators are authorized to commence an investigation–while access should likely be limited to law enforcement and other legal investigative bodies given the more acute privacy, security and other human and civil rights risks. 

Finally, what The Gambia case and the January 6 situation have in common is that they shine a spotlight on how critical such digital open source information–and digital communications more generally–have become to the investigation of both domestic and international crimes. The court’s emphasis in its decision on the 1,000 posts used by reporter Steven Stecklow to illustrate how Facebook was used by Myanmar officials to advance the genocide (posts that our Investigations Lab team helped identify) underscores the importance of advancing the jurisprudence related to the use of such content.

9. How might this decision affect the way other governments design their technology and communications policies, that is, the way they decide to protect online communications? What’s the potential influence beyond the United States?

The United States has an important role to play in setting thoughtful precedent for the handling of digital content. Although U.S. decisions aren’t binding on other jurisdictions, they can be persuasive and can help set global norms and expectations. Right now, a significant percentage of the major social media companies are based in the United States and subject to U.S. law. But of course there is an ever-increasing number of social media companies based outside the United States, and U.S. control over social media content will likely diminish proportionately over time. We really need to be thinking about the precedent the United States is setting for other countries and making sure that we continue to support and foster strong privacy laws and norms domestically, with clear due process protections when exceptions are made. Data localization laws are another player in this space, of course. The trend toward localization may be further exacerbated by decisions that compel disclosure of social media content in ways that run counter to foreign governments’ interests–as here, where Myanmar officials will probably dislike the disclosure of any content that reveals complicity in genocide.