What Kind of Oversight Board Have You Given Us?
The Facebook Oversight Board (the “FOB”) will see you now—well, at least a very small number of a select subset of you.
The names of the first 20 members of the FOB, née “The Facebook Supreme Court,” were announced last week. This marks the culmination of a more than eighteen-month process to set up this unprecedented experiment in content moderation governance and the beginning of the FOB’s journey as an institution.
Characterizing the FOB is tricky. It is court-like in that it will hear appeals from and act as a check on Facebook’s policy-formation and enforcement processes and provide public reasons for its decisions.1 But it will also give policy recommendations,2 and neither its members nor those who appear before it will be lawyers applying the law. It is a private institution fully of Facebook’s own creation, but it has reasonably robust mechanisms to ensure independence from Facebook, which has put $130 million into a trust intended to fund the FOB for at least two three-year terms. It is a global body, but it would be naïve to think that it will be able to settle global speech norms when different jurisdictions have clashed about these for many decades.
Exactly what the FOB is, then, remains to be seen. But at its core it is the most ambitious attempt yet to cut the Gordian knot of platform content moderation: given the public impact of the decisions that platforms make about what content is and is not allowed on their services, it is unsatisfactory for private, profit-driven platforms to be making these decisions unilaterally and without any accountability. On the other hand, heavy-handed government involvement in speech regulation is always suspect, and the cure for our current woes should not be worse than the disease. The FOB is therefore an effort to find a third, least-worst option.
Over the course of the FOB’s establishment, I have written at length about its purpose and promise, and about each of its founding documents. But as the FOB creaks into operation in the coming months, a number of questions of institutional design remain unanswered. The initial members are an impressive and serious group that includes former judges, constitutional law scholars, human rights special rapporteurs, a Pulitzer Prize-winning editor, and a Nobel laureate. Many pieces have been written about these people; they have been criticized from across the political spectrum and called “nonoffensively nonoffensive,” even as many others took plenty of offense. Overall, though, the reception has been positive. But the FOB could be composed of 40 diverse Dworkinian Hercules Superjudges and it would not matter if the institutional design does not set them up for success. That design is therefore the focus of this post, which assesses the most consequential outstanding questions that need to be answered as the FOB starts to hear cases.
* * *
What kinds of cases will the FOB be able to hear?
By far the most important open question, and one that I have repeatedly if not monotonously written about, is the question of the FOB’s “jurisdiction,” by which I mean which of Facebook’s decisions the FOB will have the power to review. The narrowness of the FOB’s initial jurisdiction is the biggest disappointment in its development over the past eighteen months. The FOB’s bylaws themselves contemplate a potentially vast jurisdiction, including the power to hear disputes about groups, pages, events, ads and fact-checking.3 This is the “Supreme Court” we were promised. But the bylaws only promise this jurisdiction at some unspecified time “in the future.” When the FOB first begins operations, its jurisdiction will be limited to referrals from Facebook and “content that has been removed for violations of content policies” from Facebook or Instagram.4 As I have written previously, this is a significant limitation:
“Take-downs” could appear to be the kind of decision that most threatens free expression. But a number of the most controversial content moderation decisions made by Facebook in recent years have been decisions to leave content up, not take it down: Think of the Nancy Pelosi cheapfake video in which footage of the speaker of the House was edited misleadingly so that she appeared intoxicated, or hate speech in Myanmar, or the years that Facebook hosted content from Infowars chief Alex Jones before finally deciding to follow Apple’s lead and remove Jones’s material.
Limiting the board’s jurisdiction to take-down decisions stacks the deck somewhat. It is like introducing video appeals to tennis to make calls more accurate but allowing players a review only when balls are called “out” and not when a ball is called “in,” no matter how erroneous the call seems. For those in favor of longer rallies—which probably includes the broadcasters and advertisers—this is a win, because only those rallies cut short can be appealed. For those in favor of more accurate calls generally, not so much.
Facebook’s decisions about political ads, events, and groups have also been at the forefront of recent content moderation debates. These areas are at least mentioned by the bylaws, but mention of the way Facebook’s algorithms rank content for display to users is conspicuously absent from the document altogether (despite a tantalizing suggestion in earlier documents that they might ultimately be subject to review). This may seem completely separate from decisions about individual pieces of content, but the two types of content moderation are, as CEO Mark Zuckerberg himself has explained, intimately related. Most worryingly, this provides Facebook with a loophole through which to avoid FOB oversight: Facebook could simply downrank hard cases rather than taking them down completely. Not to mention that many of the main concerns about Facebook’s effects on public discourse relate to the kinds of content that it algorithmically amplifies.
It is worth noting that many of these kinds of cases could come before the FOB from the start if referred by Facebook at its discretion. But this is clearly not a strong model of independent oversight and makes the FOB less “Supreme Court” and more “optional consultant” on many of the most important matters in content moderation. In an op-ed published in the New York Times, the FOB’s four co-chairs said that “over the next months we will add the opportunity to review appeals from users who want Facebook to remove content.” And on a press call in advance of the member announcement, Jamal Greene, one of the co-chairs, said, “The scope of content we review might very well change over time and the board itself can make recommendations along those lines. So that might be a possibility in the future.” A Facebook representative noted that “we share the board’s ambition for its role to grow over time.” These are promising noises, but the FOB’s success depends on their translation into actual institutional reform.
The FOB’s legitimacy as a true check on Facebook requires that it be meaningfully empowered to review the main content moderation decisions Facebook makes, not only a small subset that is peripheral to Facebook’s main product. When I first started writing about the project eighteen months ago, I noted that a common tactic of authoritarian regimes is to allow independent courts to persist, lending the regime a veneer of legitimacy, while depriving those courts of jurisdiction over the most consequential matters. The FOB and the public should continue to demand better.
* * *
How will the FOB choose cases?
The next most significant determinant of the FOB’s impact is how it chooses which cases to hear out of the millions of possible options generated every day. The bylaws leave this very vague,5 and that is a good thing. This should be something the FOB determines for itself, rather than set in advance by Facebook in its drafting of the bylaws.
Co-Chair Michael McConnell said that the FOB intends to focus on cases that (1) affect large numbers of users, (2) look like they may have a major effect on public discourse, and (3) raise significant policy questions across the platform. These are common-sense criteria, but they are vague and broad, and many possible cases will meet them. The FOB’s actual impact could still take many forms, and which particular fact sets are chosen will no doubt significantly influence the precedents it creates.
This leaves a Case Selection Committee, made up of at least three people, to fill in the details and to create and document the case selection criteria.6 Because of the sheer volume of possible cases, the power of the Case Selection Committee to determine the work of the FOB as a whole cannot be overstated. A puzzling detail is that Case Selection Committee member terms are only three months.7 It is unclear why Facebook chose this length in drafting the bylaws, but these short terms, combined with the fact that at least part of the FOB’s docket will comprise referrals from Facebook, may make it hard to develop a strategic and cohesive direction for the FOB’s work. This could be a substantial hindrance, especially because the dialogic nature of the FOB’s role, to which I turn next, heightens the need for strategic docket management. That dialogue will be less effective if the FOB cannot sustain its side of the conversation with Facebook because of ineffective docket management.
* * *
How broad will the FOB’s impact be?
McConnell emphasized that these case selection criteria are intended to ensure that the FOB’s decisions have an impact not only in individual cases but also across Facebook’s services more broadly. This is an obvious requirement for the FOB’s work to be meaningful, given that, like any ultimate appeals body, it will review only the tiniest fraction of the content moderation decisions Facebook makes. Individual error correction cannot be its raison d’être. However, effective case selection is a necessary but not sufficient condition for this broader impact, given the design of the FOB’s relationship with Facebook. The extent of the FOB’s impact will depend largely on Facebook too, particularly its willingness to engage in this experiment in good faith.
This is because, while the FOB’s decision in any individual case will be binding on Facebook with respect to the particular piece of content in question, the decision to apply the FOB’s ruling to “identical content with parallel context” will depend on whether Facebook deems it “technically and operationally feasible.”8 It is worth noting that this requirement only applies to identical content “that remains on Facebook”; that is, it only applies in reviews of “leave up” decisions, over which the FOB currently has no jurisdiction. This implies that orders from the FOB to reinstate content will not have broader retroactive effect. The FOB can make policy advisory statements that could affect the platform more broadly, but these “will be considered as a recommendation.”9 In all of these matters, Facebook’s obligation is to publicly disclose what it has done in response to the FOB’s decisions in its newsroom.10
This has been a widely criticized aspect of the FOB’s design, and it is not hard to see why. The extent to which the FOB can actually bind Facebook is very narrow: just the individual piece of content in any given case.11 A drop in the ocean of content on Facebook. But I remain convinced that this is, overall, a positive design choice. The biggest challenges in content moderation involve the rapid pace of change in the online ecosystem and the difficulty of enforcing any rules at scale. Policies are designed around these constraints. The FOB is not best placed to evaluate these fast-moving operational matters. As Co-Chair McConnell said, “we are not the internet police. Don’t think of us as sort of a fast action group that’s going to swoop in and deal with rapidly moving problems.” Indeed, content moderation experts see the lack of expertise in these operational matters as among the biggest weaknesses in the choice of board members. As such, “binding” Facebook to rulings that are impractical or swiftly out of date would weaken the FOB experiment by rendering its decisions unworkable, soon meaningless, and inevitably undermined.
What does the FOB add, then? The hope is that the dialogue between the FOB and Facebook, in which Facebook must make arguments in individual cases and publicly respond to FOB recommendations, will finally ventilate the reasons why Facebook makes the decisions that it does by forcing it to justify them. This process itself will hopefully improve decision-making, but at the very least it will provide a level of transparency and accountability that is currently sorely lacking. To those from the United States, the paradigm “strong-form” judicial review jurisdiction, this might seem feeble. But many other jurisdictions have a version of this dialogic “weak-form” review, and it often turns out to be much stronger in practice than it appears in theory.
This process of ventilation is an important step forward. Rather than maintaining the pretense that all content moderation decisions turn on high principle, it could shed light on the very real operational constraints involved, a necessary first step in figuring out how best to work within them.
This depends, of course, on Facebook being a willing partner-in-dialogue, and not merely responding publicly to the FOB’s decisions with the equivalent of “Noted, thanks.” But given the time and effort Facebook has put into the FOB experiment so far, and the reputational cost that observers would impose if Facebook undercut its own initiative, there is for now no reason to think that it will not be.
* * *
What standards will the FOB apply?
The FOB will never apply or interpret any country’s law.12 All cases involving content deemed to be illegal under local law are definitionally excluded from the FOB’s remit.13 Instead, it will “review content enforcement decisions and determine whether they were consistent with Facebook’s content policies and values” and “pay particular attention to the impact of removing content in light of human rights norms protecting free expression.”14 That is clear as far as it goes, but actually tells us very little. The central substantive issue for the FOB will be how to synthesize and resolve the tensions between and within each of these bodies of “authority” that the FOB must apply.
Facebook’s Community Standards and Instagram’s Community Guidelines are fairly straightforward rulebooks that lend themselves to familiar (which is not to say “easy”) issues of interpretation. Facebook’s “values” are a more complicated matter. On their face, they appear to present a standard proportionality test: Facebook prioritizes “voice,” but four other values (authenticity, safety, privacy and dignity) may justify limiting it. Defining the ambit of each of these values, and the extent to which they can be used to justify infringing on expression, will be the central preoccupation of the FOB. But Facebook’s four values are not the same as the grounds that can justify limitations on expression under international human rights law, namely the rights or reputations of others, national security, public order, public health, or morals.15 There is some overlap, but they are not coextensive. Facebook itself says it looks to “international human rights standards” in writing its rules. The FOB’s Charter requires the FOB to “pay particular attention to the impact of removing content in light of human rights norms protecting free expression.”16 It is unclear whether the omission of the word “international” in the Charter was intentional. A number of board members have considerable international human rights expertise and will no doubt try to read the word “international” back in front of the phrase “human rights norms” in their guiding principles. Co-Chairs Catalina Botero-Marino and Jamal Greene made references to international human rights values and standards on the press call, and the four co-chairs said in the New York Times that board members are “all committed to freedom of expression within the framework of international norms of human rights.”
This raises many questions for the FOB: How can the balancing demanded by Facebook’s “values” be reconciled with the balancing demanded under the “international human rights norms” that they are also committed to upholding? How will they balance the need to interpret each of these values, as well as any piece of content, in their local context while also interpreting and applying a global rulebook? Most importantly perhaps: How will they balance the competing values in any individual case? When will “dignity” outweigh “voice,” for example?
These are hard questions; indeed, it is the very difficulty of these questions that has led to the FOB’s creation as a forum for public reasoning on matters that a pluralistic society will never fully agree upon. They can only be answered over time and in particular contexts through the FOB’s decisions. There are some encouraging signs that the FOB is already thinking carefully about them, however. Co-Chair Botero-Marino described the importance of taking context into account when applying global rules and added that “this is nothing new: international law has been doing this for the last 50 years.” Co-Chair Greene noted that in his academic work he has “thought a lot about the challenge of how to handle conflicts that involve competing values.” Indeed, Greene has written one of the most eloquent arguments for adopting proportionality balancing in a country famously hostile to the idea and wedded instead to categorical rules. The challenge will be whether the FOB can prove balancing skeptics wrong by developing a coherent body of precedent that resolves these competing authorities and tensions while still providing the measure of predictability and certainty that helps legitimate its decisions.
This would be a daunting task for any institution. It almost certainly cannot be done in the fifteen hours a month that Facebook has said it expects members to commit to their role, while keeping within the ninety-day timeframe Facebook has set for each case from original decision until implementation.17 And not only must each individual board member resolve these tensions to their own satisfaction, but they must also do so in a way with which their colleagues can agree.
* * *
What if they can’t agree?
Diversity in the FOB has been one of Facebook’s priorities from the start, and across many metrics the first twenty members live up to this commitment. A twenty-person (or forty-, or even 100-person) FOB was never going to be fully representative of all the constituencies affected by Facebook, and there are mechanisms in the bylaws for the FOB to call on experts and other stakeholders to represent a range of perspectives.18 But the initial pool of members represents a good faith attempt to bring together a diverse range of viewpoints. That is definitely a strength, but it does raise another question that is not being asked enough: will this group be able to agree on anything? And what happens if they can’t?
Surprisingly, it is not entirely clear. Cases will be heard by five-member panels of the FOB. The panel’s decision will be by majority, and can include concurring and dissenting viewpoints if the five members cannot reach consensus. At this point, the draft decision will be sent to the FOB as a whole (currently twenty members, but up to forty in the future). If the FOB is not satisfied with the decision, a majority of the FOB can decide to send the case for re-review, causing a new panel to be convened and the process to start again. There is currently no mechanism to break the infinite loop that may occur if a majority of the FOB is consistently dissatisfied with panel decisions. Co-Chair McConnell said they did not anticipate the full board overruling a panel “except in the perhaps most important and extraordinary cases,” but this may be an unduly optimistic outlook: rarely do content moderation issues have fewer than twenty possible answers. Establishing a mechanism for resolving these disagreements should be a priority when the FOB amends its bylaws, before such gridlock becomes an issue (to do so, it will need a two-thirds majority and the agreement of Facebook).19
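To make the gridlock concern concrete, here is a minimal sketch, in Python, of the review flow as just described: a five-member panel decides by majority, the full board reviews the draft, and a dissatisfied majority sends the case back to a freshly convened panel. The random votes standing in for actual deliberation, and all of the names, are hypothetical; the structural point is simply that, as the bylaws are written, nothing caps the number of re-reviews.

```python
import random

PANEL_SIZE = 5
BOARD_SIZE = 20  # currently twenty members; up to forty in the future


def panel_decision(case):
    """Hypothetical stand-in for panel deliberation: five panelists vote,
    and the draft decision is whatever the majority supports."""
    votes = [random.choice(("uphold removal", "reinstate")) for _ in range(PANEL_SIZE)]
    return max(set(votes), key=votes.count)


def full_board_is_satisfied(draft):
    """Hypothetical stand-in for full-board review: a simple majority of
    the whole board must be satisfied with the panel's draft."""
    satisfied = sum(random.random() < 0.5 for _ in range(BOARD_SIZE))
    return satisfied > BOARD_SIZE / 2


def review(case):
    """Panel decides, full board reviews, and a dissatisfied majority sends
    the case back for re-review. Nothing here bounds the number of loops."""
    while True:
        draft = panel_decision(case)      # a new panel is convened each time
        if full_board_is_satisfied(draft):
            return draft                  # decision issued and published
        # otherwise: convene a fresh panel and start the process again
```

Any loop-breaking rule, such as a cap on re-reviews, would have to be added by amending the bylaws, which, as noted above, requires a two-thirds majority and Facebook’s agreement.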
Much of the public attention during the process of setting up the FOB has focused on whether there will be a breakdown in the relationship between Facebook and the FOB; more attention should be paid to the possibility of relationship breakdowns within the FOB itself.
* * *
The FOB will not solve all our problems with social media
There are many valid criticisms to be made of the FOB, well beyond the few I have mentioned here. Anyone who isn’t at least slightly skeptical hasn’t been paying attention to the content moderation saga of the past few years. The process for removing members is concerningly vague, for example, and the amendment rules are ambiguous enough to become a source of conflict. The time that Facebook has apparently asked members to commit is wildly insufficient for them to issue the kind of thoughtful, well-reasoned decisions that will be essential to establishing the legitimacy and coherence of the FOB’s precedents. The timelines for decisions are needlessly rigid, and there is no process for third parties to submit arguments unless invited by the FOB. I could go on.
But the least persuasive argument against the existence of the FOB is that it will not solve all our problems with social media. It is true that the FOB will not address Facebook’s ad-based business model, its corporate power, or the changes to the platform’s affordances that are needed to help victims of abuse better protect themselves. It also cannot hire the additional content moderators working in languages other than English that Facebook needs in order to meet its due diligence obligations in emerging markets, nor ensure that those moderators have safe working conditions. Coordinated disinformation campaigns remain outside the FOB’s remit. It cannot correct the biases inherent in the artificial intelligence tools that flag most of the content found to violate the Community Standards, nor stop the decimation of the news industry. It cannot get independent researchers access to the data necessary to study how speech actually moves through social media and the effects of different interventions. These matters and more must remain on the agenda for reform. Finding solutions to these problems will require both better mechanisms of self-regulation and hard regulation in the form of laws forcing transparency and accountability.
But the fact that we have many problems should not stop us from attempting to solve at least one of them. Currently, some of the most consequential decisions about the way information flows through society are made behind closed doors, with minimal public justification, and in a way that is influenced by business imperatives. This is at odds with how essentially every jurisdiction with a free speech tradition thinks about restrictions on speech: they should be specified clearly in advance, applied consistently, and subjected to careful scrutiny. This is the check that the FOB can bring to Facebook’s content moderation ecosystem. The FOB will not be a “special censorship committee”; the supposed “censorship” is already occurring. The FOB’s role is to make sure it can be justified. Indeed, as noted above, under its current limited jurisdiction the FOB will only be empowered to force Facebook to reinstate content, unless and until Facebook decides otherwise. This might be more marginal improvement than revolution, but the amount of time and oxygen spent on these debates every time a controversy arises suggests significant demand for exactly the kind of transparent justification, and forum for channeling contestation, that the FOB is intended to provide.
* * *
Conclusion
In their New York Times op-ed, the four co-chairs acknowledged that “we will not be able to please everyone. Some of our decisions may prove controversial and all will spur further debate.” This is a dramatic understatement. Content moderation issues have dominated headlines for years now and are always highly charged. These 20 people have signed up to be very unpopular. But they have also signed up for an important attempt to break the impasse we have arrived at, where nobody (not the platforms, governments, civil society or users) is happy with the content moderation status quo.
Solving the content moderation puzzle is one of the great challenges of the information age and platforms are the laboratories of online governance in which this challenge is playing out. The FOB is an ambitious experiment in one of these laboratories that should neither be hailed as a comprehensive fix nor dismissed as an inconsequential façade. What exactly it is, and what kind of institution the newly appointed board members make of it, is still an open question. It may be a Supreme Court yet—or a local court, Potemkin court, or ad hoc consultant. The new board members have their work cut out for them if the FOB’s oversight is to be meaningful, but there is no reason not to wish them every success.
- 1. Oversight Board Charter, art. 5, § 1; Oversight Board Bylaws, r. 3.2.
- 2. Charter, art. 1, § 6; Bylaws, r. 3.1.7.
- 3. Bylaws, r. 1.1.2.
- 4. Bylaws, r. 1.1.1 (emphasis added).
- 5. Bylaws, r. 1.2.1.
- 6. Id.
- 7. Id.
- 8. Charter, art. 4; Bylaws, r. 2.3.1.
- 9. Charter, art. 4; Bylaws, r. 2.3.
- 10. Charter, art. 4; Bylaws, r. 2.3.2.
- 11. Of course, in reality, it cannot bind Facebook at all. Despite the fancy names of “Charter” and “bylaws,” there is little that could be done if Facebook ultimately decided to ignore the FOB’s decisions altogether. The FOB has neither the sword nor the purse (Federalist No. 78); it is not the “least dangerous branch” in content moderation but a positively toothless one. Nevertheless, the fact that Facebook has gone to the effort and expense of setting up the FOB suggests that it believes the legitimacy gains it can derive from so “binding” itself will outweigh any short-term losses from implementing decisions it disagrees with. Whether this turns out to be true is what the whole endeavor hinges on, but this post is an internal critique of the FOB and proceeds on the assumption that it will operate as specified in its founding documents.
- 12. Charter, art. 7.
- 13. Bylaws, r. 1.2.2.
- 14. Charter, art. 2, § 2.
- 15. International Covenant on Civil and Political Rights, art. 19(3).
- 16. Charter, art. 2, § 2.
- 17. Bylaws, r. 3.1.
- 18. Bylaws, r. 3.1.4.
- 19. Bylaws, art. 5, § 1.