By: Dafna Dror-Shpoliansky
Three years ago Mark Zuckerberg announced that he wanted Facebook to be the social infrastructure that would help humanity become a global community. His vision included a "global supreme court" that would reflect what he defined as 'the social norms and values of people around the world'. This "court" did not take long to arrive: in October 2019 the Company announced the establishment of the Facebook Oversight Board (the "FOB"), which last month announced its first twenty members.
The FOB's jurisdiction includes the authority to make decisions regarding "content that was removed by Facebook for violations of content policies". Cases may be referred either by users (who have exhausted Facebook's internal process) or by Facebook itself. This means that, at least for the time being, the FOB's jurisdiction is limited to content that has already been removed ("takedowns"). However, following criticism of the narrow and perhaps "easy" path that Facebook chose to take, avoiding the more challenging and controversial questions about moderating content that remains online, the Board indicated that it would later be able to review appeals requesting content removals. Another hot potato that Facebook refused to leave in the hands of the FOB, for now, is fake news. Yet the FOB Bylaws do leave room for future expansion of its jurisdiction, so that it would encompass decisions regarding content that was left online, including decisions regarding content rated as false.
In addition, the FOB will publish 'policy advisory opinions' on Facebook's content policies. This authority grants the FOB the power to shape public policy by issuing policy recommendations. The advisory opinions, however, will not be binding, unlike the FOB's decisions in concrete content moderation cases, which will be binding on Facebook (unless implementing them would violate the law of a state in which it operates).
According to its official statements and documents, the Board is an independent body. Funded by a $130 million trust, its operations and mechanisms are separate from Facebook. Its 40 members include women and men from a variety of backgrounds (such as journalists, former UN Rapporteurs and public servants). Some names were proposed by the public, and its composition must, according to the Bylaws, reflect a balance across geographical regions.
It should be recalled that Facebook, a private company, has been at the forefront of the transformation and, perhaps, the mutation of the right to free expression in the online realm over the past two decades. But it is only in the last decade, which began with the outbreak of the Arab Spring and ended with a Human Rights Council report on the role of social media in the genocide and crimes against humanity committed against the Rohingya in Myanmar, that the profound promises and perils of online speech gained exceptional public prominence. In light of the risks that powerful, ubiquitous social media can pose to democracy itself, the Company, like other social media platforms, has been exposed to constant criticism from governmental and non-governmental actors and users for its non-transparent, inaccessible and unsatisfactory ways of dealing with online content moderation, including tackling hate speech and fake news. It became clear that there is an anomaly of sorts between Facebook's enormous impact on shaping public opinion, on the one hand, and the lack of effective regulation and accountability, on the other. As a result, its overall influence on freedom of expression remains controversial.
The UN Special Rapporteur on Freedom of Expression called on companies to establish "robust remediation programs" as well as meaningful public participation and transparency in content policies and decisions. Moreover, the Rapporteur emphasized the importance of tying content moderation, in particular hate speech rule setting, to international human rights law, explaining that global and well-established international norms could provide an appropriate framework through which global companies and users can communicate.
While the Board has clearly stated that it would "pay particular attention" in its decisions to international human rights norms, its decisions would at the same time be based on what are defined as Facebook's content policies and values.
Mark Zuckerberg, Building Global Community, Facebook (Feb. 16, 2017), https://www.facebook.com/notes/mark-zuckerberg/building-global-community/10154544292806634/.
Ezra Klein, Mark Zuckerberg on Facebook's Hardest Year and What Comes Next, VOX (April 2, 2018), https://www.vox.com/2018/4/2/17185052/mark-zuckerberg-facebook-interview-fake-news-bots-cambridge.
Oversight Board Bylaws (January 2020), Article 3(1.1.1).
Evelyn Douek, Facebook's Oversight Board Bylaws: For Once, Moving Slowly, LAWFARE (Jan. 28, 2020), https://www.lawfareblog.com/facebooks-oversight-board-bylaws-once-moving-slowly.
Oversight Board Bylaws (January 2020), Article 3(1.1.2).
Oversight Board Bylaws, Section 2, Article 2.3: "The board's resolution on each case will be binding on Facebook, unless implementation of a resolution could violate the law, while the policy advisory statement from the board will be considered as a recommendation."
Oversight Board Bylaws, Article 1.4.1.
Human Rights Council, Report of the detailed findings of the Independent International Fact-Finding Mission on Myanmar, U.N. Doc. A/HRC/39/CRP.2, ¶¶339-342 (Sept. 17, 2018).
Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Rep., ¶59, U.N. Doc. A/HRC/38/35 (April 6, 2018).
Special Rapporteur on Freedom of Expression, Report (Oct. 9, 2019), ¶58(b).
Facebook Oversight Board Charter, Article 2(2).
 Jeffrey Jowell, Courts and the Administration in Britain: Standards, Principles and Rights, 22 Isr. L. Rev. 409, 417 (1988).