Facebook recently published proposed bylaws for its Oversight Board, an independent body that will review Facebook’s removals of posts. Last year, we wrote about the charter for the board, anticipating that some open questions would be answered when the bylaws were released. The formal legal structure of the board is now clearer: it will be administered by an LLC, established by a trust funded and appointed by Facebook. The credibility of the Oversight Board hinges on its independence from the company – and by extension the trust, which will be made up of one corporate trustee (Brown Brothers Harriman Trust Company of Delaware, N.A.) and at least three individual trustees, whose identities have not yet been disclosed.

The board is set up to operate without interference on the critical issues of selecting and deciding cases, although initially it will review only a narrow set of issues. But the trustees and the company itself retain authority over other matters that underpin the board’s independence, such as its budget and the appointment and removal of board members, a degree of power that could be exercised to influence the direction of the board. Given how central the Oversight Board has become to Facebook’s strategy for addressing content moderation issues, however, there is little reason to think that the company intends to hamstring its creation. Any such attempt would, in any event, create enormous backlash if it came to light – an outcome that the company surely wants to avoid. At the same time, some of the consequential issues discussed below would likely never come to light.

Case Selection and Decision-Making

Building on the charter, the bylaws give the board control over case selection and decisions within the limits set by Facebook. They make clear that the trust “will not have a role in reviewing cases or interfere with the board’s exercise of its independent judgement.”

Each case identified by the board’s case-selection committee will be assigned to a five-member panel, four picked randomly from the board at large and one “from among those board members who are from the region which the content primarily affects.” Board membership must encompass seven regions specified in the bylaws (the United States and Canada; Latin America and the Caribbean; Europe; Sub-Saharan Africa; the Middle East and North Africa; Central and South Asia; and the Asia Pacific and Oceania).

But deciding which region the content “primarily affects” could raise complex issues. For example, if a post originated in the United States but was about a politician in Central Asia, which region would be primarily affected? A panel’s decision must be reviewed by the full board, a majority of which can require its re-examination by a new panel. Automatic en banc review of every case should promote uniformity among panel decisions, but it will also slow decision-making, which could undermine the board’s efficacy.

The board’s jurisdiction is quite narrow. In its initial phase, it is authorized to hear appeals from individuals objecting to the removal of individual pieces of content for violations of Facebook’s Community Standards. It cannot hear cases about account suspensions (e.g., those related to the Soleimani drone strike) or cases in which individuals object to posts that Facebook declines to remove (e.g., manipulated videos of House Speaker Nancy Pelosi).

Content that Facebook has ultimately allowed to remain on the platform will be reviewable “in the future,” depending on “Facebook’s technical and procedural improvements,” along with a broader swath of Facebook content, including groups, pages, profiles, advertisements, and content that third-party fact-checkers have rated false. To request a review by the board, a person must have an active Facebook or Instagram account, which suggests that there are no immediate plans to allow appeals of account suspensions.

Limiting the board’s jurisdiction in the first instance to takedowns makes sense given concerns about Facebook’s power to suppress speech and the charter’s foundational commitment to freedom of expression. It is also practical: setting up a new institution is no small undertaking and will encounter many unanticipated hiccups along the way.

At the same time, it leaves the board out of many of the controversies roiling our societies right now, from hate speech to election misinformation to “deep fakes.” Facebook may well take these cases to the board itself, but that option – like the potential future expansion of the board’s jurisdiction – leaves the company as the sole arbiter of whether and when its responses to these issues will be subjected to board evaluation.

The Oversight Board is also banned from hearing cases where “the underlying content is criminally unlawful in a jurisdiction with a connection to the content (such as the jurisdiction of the posting party and/or the reporting party)” and where a board decision to allow the content to remain on the platform “could” lead to “criminal liability” or “adverse governmental action” against the company or the board. Facebook has justified this carve-out on the grounds that it cannot give the board more power than the company has itself under the law.

But, as explored in this earlier post, this is not a satisfactory explanation.

Local law may require a company to take down or leave up certain posts, but that is a question of implementation (which Facebook has separately reserved for itself). The board could be given the authority to decide whether Facebook has correctly concluded that local law requires the removal of a post, and whether that law itself conforms with international human rights law. These types of opinions from the board, even if not implemented, would be valuable in developing norms for issues that currently are determined entirely by the company.

Resources and Information

The board will be financially independent from the company for its first several years of operation. The bylaws provide that Facebook will fund the trust for at least six years upfront in the form of an irrevocable grant, and the company has announced an initial commitment of $130 million to cover this period.

Despite this large grant, Facebook retains financial control over the board through the trust, which is responsible for approving the board’s budget and for requesting any additional funding from the company. The annual operating budget developed by the board must be submitted to the trust for approval. In reviewing the budget, the trust will, among other things, consider whether it “complies with the board’s stated purpose,” leaving room for fairly open-ended judgments.

The bylaws contain important commitments by Facebook to provide information to the board, but they place people appealing takedowns at a significant disadvantage. Facebook will provide the board with the policy rationale for its decision, a graphic of the content in question, where it was posted, who posted or reported the content, and the case history. The board can request additional information – for example, relating to the reach of the contested content and the prevalence of similar content on the platform – but it is up to the company to decide whether to provide it. Appellants, of course, are authorized to make submissions explaining why their content should not have been removed. But they are handicapped because they are not entitled to see Facebook’s policy rationale for the takedown (they may get to see it “in the future, once technically feasible”). And, unlike the company, appellants are restricted to a single initial submission.

Selection and Removal of Board Members

The initial slate of board members will be selected by Facebook and the co-chairs of the board, who are also appointed by the company. Thereafter, identifying new board members “will become the sole responsibility of the board’s membership committee.” The committee will prepare a slate of recommended candidates; candidates who receive a majority vote of the full board will be presented to the trustees for “formal” appointment.

Characterizing the trustees’ authority as “formal” suggests that it is limited in some way, but the extent of this limitation is not clear. The bylaws say that the trustees can “appoint or reject” candidates and require the trustees to ensure “that the board maintains geographic balance,” suggesting that they are not meant to simply rubber stamp the board’s recommendations.

The bylaws and charter also give the trust a greater role in member removals than in appointments. The documents seem to preserve the integrity of the board’s decision-making by stating that its members “will not be removed due to content decisions they have made,” but only “if the trustees have determined that that member has violated the code of conduct.” The board (by a two-thirds vote) can recommend removal, as can the board’s director of administration and the public.

Unlike for appointments, the removal power of the trustees does not seem to be qualified by the term “formal,” and individual trustees may also trigger the removal process based on violations of the code of conduct. The requirement that removals be based on the code of conduct means that the trustees’ discretion in this regard is not unfettered, but the code itself includes several broad grounds for removal. For example, it has an expansive “morality” clause as well as broad prohibitions on the appearance of conflict of interest or acting in ways that are against the interest of the board. While these far-reaching grounds for removal provide avenues for addressing unexpected situations, they could also be misused.

Amendments

The process for approving amendments to the bylaws gives Facebook and the trust control over several issues that could significantly impact the board’s functioning, in some cases without board approval. For example, the trust can amend the section of the bylaws governing its role in the appointment and removal of board members, which is key to insulating the board from the company’s influence, with only the approval of Facebook and the trust’s corporate trustee.

Similarly, so long as an amendment does not conflict with the charter or remove authority previously granted to the board, Facebook need only consult the board, not obtain its approval, to amend the section of the bylaws governing case-review timelines, the types of content the board can review, and appeal-submission procedures. It is not clear whether the board or Facebook decides whether a change removes authority from the board or conflicts with the charter, questions that could well be highly contentious. These powers give Facebook the ability to affect the Oversight Board’s practical functioning at any time, raising questions about the board’s long-term independence.

The bylaws, along with the charter, establish an institution that is calibrated to balance various interests: Facebook’s oft-stated worry that it is making decisions that affect free speech without any outside check, the company’s interest in demonstrating concern for its users at a time when it is under pressure on many fronts, and the need to craft an institutional structure that gives the Oversight Board legitimacy while protecting the company’s interests. But under the charter, the bylaws are not final until they are approved by the board itself, and board members may well balance these interests differently than Facebook.

IMAGE: Protesters with the group “Raging Grannies” hold signs during a demonstration outside of Facebook headquarters on April 5, 2018 in Menlo Park, California. They were calling for better consumer protection and online privacy in the wake of Cambridge Analytica’s unauthorized access to up to 87 million Facebook users’ data. (Photo by Justin Sullivan/Getty Images)