After months of consultation, drawing up policies and setting up legal arrangements, Facebook is about to unveil its Oversight Board for content decisions. Although Mark Zuckerberg initially spoke of the board as a “Supreme Court” of sorts, it is of course a self-regulatory body with global impact, not a public institution. The board may be the most interesting development in social media self-regulation since the 2008 Global Network Initiative on internet censorship and privacy rights, and since Google began the practice of publishing transparency reports in 2010.
As part of Facebook’s system for monitoring and regulating content posted on its global platform, the board is subject to many of the criticisms that are typical for self-regulatory models, especially questions of mandate and accountability. But such structures also can have advantages, and like transparency reports and GNI, this board could prove a worthwhile exercise.
The first concern, mandate, is an issue when the objectives of a self-regulatory system are drawn up by an entity lacking the “democratic legitimacy” that a statutory or constitutional authority might carry. Facebook’s content-moderation system uses rules and standards that did not come from an elected legislature, a supreme court, or another source that carries democratic legitimacy. In issuing its statement of values and community standards, the company has already set out the rules the board will apply, and it has done so unilaterally, in a manner that undermines the legitimacy of those very rules.
This triggered understandable concerns that the company is defining for itself its duties toward the public. The criticism might have been avoided or mitigated by importing democratically legitimate goals from a statute, or from a source regarded as at least relatively legitimate across ideological lines, such as international law. Given the global application of these rules, international human rights law is the only source of norms that might be acceptable to the majority of Facebook’s users.
Some of Facebook’s values do map onto international human rights, and the company does make explicit reference to these rights. Some of the values also reflect principles contained in several democratic constitutions. However, other values, such as authenticity, have no clear counterpart in either. By veering off the well-developed, if complex and detailed, path of international human rights standards, Facebook also may have left out several human rights norms.
Finally, unilateral control over the mandate also leaves Facebook vulnerable to the critique that it might shift its values and mandate in a self-serving manner, triggering a corresponding change in the objectives of the Facebook Oversight Board. The actual likelihood of self-serving changes may be limited, as a credible board with a reputation to protect may not countenance being strong-armed openly. Once the members of the board are announced, it will be easier for the public to evaluate whether the board is likely to accept excessive, unilateral exercises of power by Facebook, or whether it might gradually reintroduce international human rights into the board’s mandate.
The second criticism of self-regulation is that it tends to come with reduced accountability. It is clear by now that Facebook has the power to regulate a vast amount of content, including public deliberation that is important in a democracy, in a manner that evades accountability through the usual democratic channels.
An optimistic view of the Oversight Board might cast it as an opportunity for Facebook to demonstrate that it is answerable to an expert body that, though dependent on the company for a sustained future and expanded scope, may be independent in thought. Only once the board begins to make – and publicly issue – decisions will we see how much independence it will be able to exercise and whether it will protect the interests of users and hold Facebook accountable. Unfortunately, the fact that the board is allowed, by Facebook’s charter and bylaws for the body, to hear only certain types of cases – and a limited number at that – means its influence over the company’s actions will be correspondingly limited.
Still, the board’s interventions may help Facebook regain some public trust by demonstrating that the company cannot circumvent the board easily.
Facebook’s overall content-regulation system, including the Oversight Board, displays several of the characteristics that make self-regulation appealing: responsiveness, flexibility, greater compliance, and informed and targeted intervention. Facebook may make its own rules, but like many private companies, it is likely to be more willing to comply with them than it might be to abide by external standards, in part because it knows better than anyone else what information it can access to understand a problem and what it is technically capable of implementing.
Facebook, like any self-regulating company, also can be responsive and speedily adjust the rules as necessary when new problems emerge. As an example, the company has been able to experiment and react nimbly to disinformation problems related to elections or the current coronavirus pandemic, demonstrating a flexibility that sometimes can be difficult for government institutions.
Facebook can regulate speech in ways that a state cannot. It can create and implement rules that would violate the First Amendment if they were law. It can reshape the platform’s architecture to implement its decisions – making it impossible, for example, to publish a forbidden image.
This provides an uncomfortable amount of leeway for self-serving actions, or choices that affect Facebook users’ human rights. However, while the board operates within a narrow scope and exercises very limited control over the company’s content decisions, it does compel Facebook to engage in a structured manner with influential external experts. The company can no longer contemplate a change to its community standards without worrying about how the Oversight Board might react.
Watch the Oversight Board Carefully
In the final analysis, based on the bylaws, the Oversight Board’s direct influence may extend to only a small part of the content that Facebook handles. But the board initiates a significant step in social media self-regulation. It is especially interesting because its scope is global and its quest for legitimacy therefore extends to populations who may not necessarily share the same beliefs or practices. The evolution of the Oversight Board will be interesting in its own right, but also as an experiment in corporate governance and self-regulation.