"The greatest sources of the century" is what the New York Times called the documents that Frances Haugen, a former product manager at Facebook, has recently leaked. Haugen revealed internal company documents indicating Facebook's failure to address serious human rights concerns caused by its platform, while possibly promoting its economic interests at the expense of protecting its users' human rights.[1]
The "Facebook papers" contain research reports, records of employee discussions and draft presentations to senior management, all of which indicate that Facebook officials, "including the chief executive himself", were well aware of the platform's harmful effects but could not, or would not, address them.[2] Haugen's revelations first came out in a series of articles titled "The Facebook Files", published in September 2021 by Jeff Horwitz of the Wall Street Journal; last month, she revealed important information about Facebook's harmful strategy for handling incitement and harmful online content in her testimony before US senators and during a preliminary hearing before a UK parliamentary committee.[3]
Among the many disturbing issues outlined in the documents Haugen revealed, the "XCheck" file, concerning Facebook's shielding system for online content moderation, stood out prominently. Apparently, when it came to "selected members" of Facebook's community, the company did not enforce its own standards. What initially started as a "cross check" system, developed to prevent Facebook's AI-based content moderator from wrongfully removing posts or photos of high-profile people, developed over time into a "white list" exempting high-profile users altogether from Facebook's terms of use. Soon after, influential profiles abused the XCheck system to post hate speech, incitement to violence and fake news, while Facebook turned a blind eye. Estimates are that approximately 8 million accounts were white-listed.[4]
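The dynamic described above can be made concrete with a short sketch. The code below is purely illustrative and hypothetical: all names and logic are invented for this post, and it is not Facebook's actual implementation. It shows, under those assumptions, how an escalation path meant to add human review for certain accounts becomes a de facto exemption if the review queue is never processed.

```python
# Illustrative model of a "cross-check" moderation path degrading into
# an exemption. Hypothetical names and logic; NOT Facebook's real system.

WHITELIST = {"high_profile_user"}  # accounts routed to "cross-check"

def moderate(author: str, flagged_by_classifier: bool) -> str:
    """Return the action taken on a post the automated classifier reviews."""
    if not flagged_by_classifier:
        return "keep"
    if author in WHITELIST:
        # Intended: defer to human reviewers for a second opinion.
        # If the review queue goes unprocessed, "defer" means the
        # flagged content simply stays up: enforcement never happens.
        return "defer"
    return "remove"  # ordinary users face immediate automated removal

print(moderate("ordinary_user", True))       # removed right away
print(moderate("high_profile_user", True))   # deferred, i.e. left online
```

The design point the sketch isolates is that the two branches were meant to differ only in *who decides*, not in *whether* the rules apply; once the human step silently fails, the branch itself becomes the exemption.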
However, this was not the only issue. According to the reports, the system caused damage not only to the general public but also to individual users. According to Horwitz's report, the "white list" shielded some high-profile accounts from having posts removed that included revenge porn, the non-consensual publication of a person's sexually explicit images or videos. In one prominent case, because the person who published such content was a white-listed user, Facebook attempted to manage the case through alternative means rather than removing it immediately. As a result, the nude photo of a woman was reposted 50 million times before it was removed; the posting was viewed 16 billion times, all without the victim's consent, allegedly despite Facebook's awareness of both content and context. Contrary to the company's terms of use,[5] the account that posted the photo was never closed. And so it appears that the code – in this case, 'Facebook's code' – slowly became the prevailing law: a law before which not all users are equal.
This story was just the tip of the iceberg. The "Facebook papers" contain an almost inconceivable range of alarming, high-profile instances, across various subject-matter areas, in which Facebook failed to protect the human rights of its users. The papers show how the platform was used by human traffickers; how armed groups used it to incite violence against minorities; how drug cartels used it to train and pay their members;[6] and how the use of Instagram, owned by Facebook, has a devastating impact on teenage girls' well-being.[7] Another prominent 'file' concerned Facebook's involvement in disseminating incitement to violence in India via WhatsApp, also owned by Facebook; according to this report, this activity might have led to deadly religious riots in the country.[8]
No doubt, the Facebook genie has long been out of the bottle. The breadth and volume of the issues exposed in the papers underscore the abnormality of a situation in which one private tech company controls a vast amount of media space while penetrating every aspect of its users' everyday lives. Facebook's unique ability to shape multiple aspects of our lives, be they civil, political, social or economic, underscores the exceptional legal situation in which we find ourselves.
The publication of the 'Facebook papers' sheds light on the limited ability of current legal frameworks to secure human rights in the digital age. It also highlights the pressing global challenges the international community faces in its efforts to ensure respect for, protection of and promotion of human rights at this time: a significant gap exists between the context in which the international human rights law framework was developed and the features of the digital age. A further gap lies between the pace at which national and international standards are created and the pace at which technology advances. The result is a reality of legal vacuums and protection gaps between the online and offline arenas. Thus, for example, there are today no binding regulations in international law applicable to private companies that could provide a substantive answer to the concerns the 'papers' raised. Even the 2011 UN Guiding Principles on Business and Human Rights, the most relevant international legal instrument in this regard, contain no precise provisions requiring a private company to halt the development or marketing of a product that adversely affects the well-being of users or democratic values.
The same is true for the protection of whistle-blowers: the current framework for protecting whistle-blowers under international law is sector-specific (focusing primarily on corruption or specific labor conditions[9]) and does not protect whistle-blowers who report human rights violations associated with a product developed or used by a privately owned entity. This means that whistle-blowers such as Haugen, or Christopher Wylie (who revealed the Cambridge Analytica scandal in 2018), currently enjoy no legal protection in the event of a lawsuit brought by the 'tech giants'.
What used to be perceived as a fundamentally different 'arena', defined by cyberspace and its unique features, has effectively become the political reality we now live in: private companies are becoming the dominant actors pulling the strings, and the very subjects of rights, the humans, have become the "commodity" of this new "datafied"[10] world. Arguably, in this brave new world, in which we constantly encounter legal gaps and anomalies, the international community can no longer rely on applying existing protective frameworks directly or through analogies to existing norms.[11]
In order to put the genie back in the bottle, we need to develop new legal models and paradigms. Government regulation is important but not enough. There is an urgent need to re-conceptualize familiar legal frameworks and adapt them to the new reality. Our virtual identities are one example of something that needs to be reshaped, especially in anticipation of a metaversed virtual reality: how should we, if at all, set apart the virtual persona from the physical one, when predictions and virtual representations of individuals impact the behavior and self-determination of the physical individuals?[12]
In addition, there is a need to address the question of ownership and control of data, and possibly to consider a hybrid definition of private entities as a way to strengthen the human rights obligations of companies: rather than treating a platform as a privately owned resource or as public infrastructure, we ought to consider a move toward product-based regulation. Such regulation could cover only certain aspects of a product, similar to the approach taken, to some extent, by the EU in its draft AI Regulation.[13]
Facebook, under its current name, Meta, has recently declared that "connection is evolving and so are we". The question is whether the international legal frameworks protecting human rights will evolve in time as well.
[2] Jeff Horwitz, The Facebook Files (WSJ, 13.9.21), available at: https://www.wsj.com/articles/the-facebook-files-11631713039
[3] Jim Waterson and Dan Milmo, Facebook whistleblower Frances Haugen calls for urgent external regulation (The Guardian, 25.10.21), available at: https://www.theguardian.com/technology/2021/oct/25/facebook-whistleblower-frances-haugen-calls-for-urgent-external-regulation
[4] Jeff Horwitz, The Facebook Files (WSJ, 13.9.21), available at: https://www.wsj.com/articles/the-facebook-files-11631713039
[5] Jeff Horwitz, The Facebook Files, Part 1: The Whitelist (WSJ, 13.9.21), available at: https://www.wsj.com/podcasts/the-journal/the-facebook-files-part-1-the-whitelist/aa216713-15af-474e-9fd4-5070ccaa774c?mod=article_inline
[6] J. Scheck, N. Purnell and J. Horwitz, Facebook Employees Flag Drug Cartels and Human Traffickers. The Company's Response Is Weak, Documents Show (WSJ, 16.9.21), available at: https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953?mod=article_inline
[7] G. Wells, J. Horwitz and D. Seetharaman, Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show (WSJ, 14.9.21), available at: https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739
[8] N. Purnell and J. Horwitz, Facebook Services Are Used to Spread Religious Hatred in India, Internal Documents Show (WSJ, 23.10.21), available at: https://www.wsj.com/articles/facebook-services-are-used-to-spread-religious-hatred-in-india-internal-documents-show-11635016354?mod=article_inline
[9] Dimitrios Kafteranis, The International Legal Framework on Whistle-Blowers: What More Should Be Done?, 19.3 Seattle J. Soc. Justice 729, at 734 (2021); and see, for example: United Nations Convention against Corruption (UNCAC), article 33; ILO, Violence and Harassment Convention, 2019 (No. 190), article 10(b)(iv).
[10] Wendy H. Wong, What Happens When We Become Data? The Consequences of Datafication, SRI Institute, University of Toronto (12.7.2021).
[11] Aulis Aarnio, Paradigms in Legal Dogmatics, in Theory of Legal Science: Proceedings of the Conference on Legal Theory and Philosophy of Science 26 (Peczenik, Lindahl, & Van Roermund eds., 1984).
[12] Julie E. Cohen, Between Truth and Power: The Legal Construction of Informational Capitalism, at 67-68 (Oxford University Press, 2019).
[13] Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonized Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, COM(2021) 206 final, 21 April 2021, available at: https://digital-strategy.ec.europa.eu/en/library/proposal-regulationeuropean-approach-artificial-intelligence