In the Facebook Oversight Board’s decision on the indefinite suspension of former President Donald J. Trump’s account, the board found that the suspension was justified, but declared the indefinite penalty indeterminate, standardless, and generally inappropriate. It raised questions about the procedure Facebook followed in reaching its decision about the penalty. Facebook now has six months to reexamine the arbitrary penalty of an indefinite suspension and return with an “appropriate” penalty corresponding to clear rules for severe violations that apply to all users. The Oversight Board also offered policy recommendations (which are not binding) on the suspension of political leaders’ accounts more broadly, suggesting that if a head of state or high government official repeatedly posts messages that pose a risk of harm under international human rights norms, Facebook should suspend the account.

Among the most remarkable features of this decision is that Facebook declined to answer several questions put to it by the Oversight Board. These included questions about the visibility of Trump’s content as a result of Facebook’s newsfeed and other features, and whether political officeholders had contacted the company about the former president’s accounts. Facebook also declined to share information about the suspension of other political figures’ accounts and the removal of other content, which some may argue is necessary if the Oversight Board is to offer useful and granular policy guidance about Facebook’s treatment of political leaders’ accounts, or to examine whether Trump was treated differently from other political leaders. The other notable feature is the dissenting voice of a minority of the board members threaded through the decision. (More on that below.)

In this essay, I examine the Oversight Board’s decision in the context of the options before it. I first offer background on the Trump Ban decision and its global significance, and then discuss how broadly or narrowly the Oversight Board might have read the questions before it. This approach follows my analysis in a law review article, in which I argue that choices made when framing the problem will be critical to how the Oversight Board is perceived. Next, I discuss the Oversight Board’s decision, the questions it answers, and the ones it leaves open. Finally, I highlight the board’s recommendations about political leaders’ speech, and the alternative point of view put forward by a minority of the board’s members.

Crucially, we have no answer yet to the question of how Facebook will treat political leaders who incite violence in the future, or how far the controversy surrounding its opaque, seemingly arbitrary, and potentially self-interested decisions is justified. We do know that the Oversight Board thinks that Facebook ought to follow human rights norms, including the Rabat Plan of Action, in making these decisions, consistent with the board’s prior decisions in which it has referenced the Rabat document and international human rights law. The Oversight Board has also urged the company to publish the process that it uses for decision-making. Unlike the board’s binding decision about the indefinite suspension of Trump’s account, its policy guidance is advisory, which means Facebook is not required to comply. However, this guidance does recommend a standard and calls for transparency. Whether Facebook decides to comply will speak volumes about its commitment to human rights.

Suspending the Former President’s Account

Donald Trump’s Facebook account was suspended initially for 24 hours and then “indefinitely” following two of his posts made while the Jan. 6 attacks on the U.S. Capitol were underway (Tess Graham’s piece offers a detailed account). The suspensions were for violation of the company’s Dangerous Individuals and Organizations (DIO) policy, which prohibits praise, support, and representation of highly violent events that Facebook designates as such. Facebook described this suspension as having taken place under extraordinary circumstances in its referral to the Oversight Board.

In the period in which the Oversight Board was considering this case, the board expanded its remit from reviewing only cases in which content is removed, to cases in which content has not been removed but users wish to contest Facebook’s decision to leave it up. This means that if the company should choose to leave a political leader’s post up after it has been reported for violation of its community standards, the Oversight Board can now reverse this decision in response to a user’s appeal. It is worth noting that account suspensions do not currently fall within the Oversight Board’s appellate jurisdiction, but the board was able to review Trump’s suspension because the case was referred directly to it by Facebook.

Beyond the questions the suspension of Trump’s account raised in the United States, the case is of interest to people across the world. Globally, there’s significance in what this decision means for how Facebook will treat political leaders. The involvement of a political leader and head of state makes this case different from the Oversight Board’s other decision on the Dangerous Individuals and Organizations (DIO) policy.

Interestingly, Facebook told the Oversight Board that it never applied its “newsworthiness” exception for influential individuals to Trump, but used the “cross-check” system that it applies to some high-profile accounts to minimize the risk of errors in enforcement. According to the company, this system is different but not more permissive. This means that all the posts by Donald Trump that remained on Facebook were found to be compliant with its community standards.

Questions Asked and Answered

The Oversight Board decided on two broad questions. The first was whether the indefinite suspension of Trump’s access to Facebook and Instagram was correct, and the second was a policy guidance request about the suspension of accounts of political leaders. These questions might have been constructed narrowly or broadly.

A narrow construction might involve examining the suspension of Trump’s account to see whether it was justified for the reasons offered by Facebook, as opposed to other reasons which might also have applied. It might focus entirely on the actual penalty – the indefinite suspension – instead of on what an appropriate penalty ought to be. A narrow construction might also focus on a limited set of events – for example, the two posts that were removed this year and the attack on the Capitol – instead of older posts and events that might have justified the suspension; and on Trump exclusively, without considering how other political leaders are treated in similar circumstances. Indeed, Facebook’s refusal to answer some of the questions put to it by the board made an expansive construction difficult where additional factual information was necessary for a decision.

A broader construction, which a minority seemed to favor, might have involved the board’s taking an expansive view of the questions referred, and considering how the situation should have been evaluated in view of Facebook’s values and community standards. As a part of this approach, it might have accounted for Trump’s history. It might have set forth a clear standard for how Facebook should evaluate similar situations. The expansive view might have clarified a standard that the Oversight Board expects Facebook to apply to political leaders’ incitement, encouragement, or endorsement of mass violence in the future. It would have offered much-needed clarity on when Facebook’s values must take precedence over other priorities and concerns. While the policy guidance does offer strong rights-protective recommendations with great signaling value, that part is optional for Facebook. Only after Facebook responds will we know whether, and how far, the company intends to comply.

The Oversight Board evaluated the account suspension on Facebook’s terms, in view of the DIO policy, without discussing how other community standards might have applied. Although the decision recognizes that other community standards like the Violence and Incitement Policy might have applied to Trump’s posts, the board refrained from commenting on them in the binding part of its decision. The binding part of the decision focuses on the process for determining the penalty, leaving us without enforceable rules from the Oversight Board on how Facebook should treat incitement to violence by political leaders, given its policies and values. More detail appears in the policy guidance, which shows that the Oversight Board believes that international human rights norms should be reflected in these decisions. It is here, for example, that the board says, “Facebook should recognize that posts by heads of state and other high officials of government can carry a heightened risk of encouraging, legitimizing, or inciting violence – either because their high position of trust imbues their words with greater force and credibility or because their followers may infer they can act with impunity.” That’s an important marker for the treatment of government officials worldwide, but, once again, in the non-binding section of the board’s analysis.

Overseeing Policy on Political Leaders’ Speech

The Oversight Board looked into both the substance and the process of Facebook’s decision and concluded that while Facebook was right to suspend Trump’s accounts, it had not followed a clear, published process to determine the penalty. On the question of process, the board did not mince words, writing that Facebook seeks to “avoid its responsibilities” by “applying an indeterminate and standardless penalty and then referring this case to the board to resolve.” In doing so, the Oversight Board confirmed what the public suspected but could not verify: Facebook’s process for making these decisions appears to be uncomfortably arbitrary (as highlighted by the Oversight Board’s Content Director Eli Sugarman in his excellent Twitter thread about the decision, and by co-chair of the board Michael McConnell in his interview on Fox News Sunday).

One thing that this decision clarifies is that the published “newsworthiness” policy was not the reason that twenty of the former president’s posts flagged for violation of community standards remained online. Facebook’s explanation for this is an entirely different, obscure “cross-check” system. This process subjects some “high profile accounts” to additional internal review, through which content initially marked as violating the community standards frequently seems to be found not in violation after all (although Facebook told the board that this system is not more permissive than the system employed for other users). This suggests that criticism directed at the company for arbitrary decision-making about high-profile accounts is well-founded. Facebook also refused to share information about the suspension of other political leaders’ accounts with the Oversight Board.

It is clear that the Oversight Board is dissatisfied with what it knows about Facebook’s process for making these decisions. The binding part of its decision requires Facebook to follow a set process for choosing a penalty for the Trump case and, more broadly, for severe violations of its content policies. It has given Facebook six months to develop a new process, which must be “clear, necessary and proportionate.” This is significant, since it means, in theory, that Facebook will have to cede the freedom to make arbitrary decisions about the penalties that apply to high-profile political leaders for violation of its content policies.

The majority of the board decided against including the criteria for Facebook’s human rights responsibilities in the part of its decision on whether to uphold the indefinite ban, which means that the criteria offered are not binding. The minority wanted to add these criteria, such as whether reinstating Trump’s account might result in imminent discrimination, violence, or other lawless action, and accounting for context and conditions on and off Facebook. The majority decided that these criteria should be policy guidance, making it optional for Facebook to act on them. In sum, it appears that if Facebook develops a process that does not offer adequate or reasonable protection to the targets of incitement to violence, the Oversight Board may not be able to review it unless another case implicating the new policy comes before it. (It is an open question whether the board will engage in that kind of review in its transparency reports, which promise to assess Facebook’s compliance with the board’s recommendations.)

The Minority and Dissent

An interesting contrast is offered by the voice of a minority of the Oversight Board members who wanted to take a more expansive view of the questions. Although the minority’s opinions have been expressed clearly within the decision, these arguments are not offered on their own terms. Unlike dissenting judgments, which can give us an alternative way to look at a question or problem, these paragraphs hint at alternate readings and hidden depths without exploring them in detail. Since the whole Oversight Board is an expertise-based body rather than a democratically elected one, it might have helped Facebook and the public to hear these outvoted experts’ views on political leaders and dignity-based harms. It might also have been interesting to get a detailed account of why the minority felt that Facebook’s human rights responsibilities should be a part of the decision rather than the policy guidance. Every time I ran into the minority’s views, I wondered what a full dissenting opinion might have said, and why it wasn’t possible to write it. Dissents (like assents) are only allowed anonymously, so there is no reason for board members to write one except when they feel strongly about presenting an alternative view.

Several questions raised by the minority members are astute. For example, they have pointed out that context and conditions on and off Facebook are key to making these decisions. In view of this, one might argue that the Oversight Board was given limited information to assess context, thanks to Facebook’s refusal to share how newsfeed and other design features impacted the visibility of Donald Trump’s content, or its refusal to offer information about how it makes similar decisions about other political leaders. With limited information, only egregious cases like the storming of the Capitol can be identified as ones that ought to be taken seriously. It may become difficult for affected parties to argue that a different political leader is dangerous enough to warrant an account suspension, or even removal of dangerous posts. As it happens, the Oversight Board can currently examine account suspensions only if asked to do so by Facebook in a referral. Account suspensions, which are among Facebook’s most powerful tools to restrict speech on its platform, do not seem to be within the board’s appellate jurisdiction.

This decision raises more questions than it answers. We have confirmation that skepticism about Facebook’s decision-making about heads of state is justified, but no answer to whether or how the company will change this. We know that Facebook shares limited information with its own self-regulatory body about its treatment of heads of state, and we are left with all the unanswered questions the Oversight Board raised. The opacity of Facebook’s cross-check system, which appears to be the reason that Donald Trump’s account went untouched so long, is not going to inspire trust and confidence in the public. The Oversight Board’s decision focused on the arbitrariness of how penalties for extreme violations of content policies are chosen, which is a worthwhile problem to solve. The problem of heads of state who incite violence stays with us and with Facebook.

Photo credit: Al Drago-Pool/Getty Images