What is Facebook Getting Wrong in Combating Revenge Porn?
Published: August 19th, 2018
In November 2017, it was revealed that, in an effort to combat revenge porn on its three social networks – Facebook, Messenger, and WhatsApp – Facebook is testing a tool meant to return some control to individuals, and especially to victims of revenge porn abuse. Individuals who shared intimate, nude, or sexual pictures with partners and worry that the pictures might leak, or that their (ex-)partner will distribute them without consent, can use Messenger to send(!) these intimate pictures to Facebook to be “hashed.” In this process, Facebook converts the picture sent through Messenger into a unique digital fingerprint. With this fingerprint, Facebook can later identify copies of the picture and block attempts to upload the same picture to Facebook’s social networks.
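Facebook has not published the details of its photo-matching technology, but the core idea, converting a picture into a fingerprint and comparing future uploads against stored fingerprints rather than stored pictures, can be illustrated with a short sketch. The Python below is only an assumption-laden illustration, not Facebook’s implementation: it contrasts an exact cryptographic fingerprint with a toy perceptual “average hash” (real systems use far more robust perceptual matching), and the blocked_hashes set and function names are hypothetical.

```python
import hashlib
from PIL import Image  # Pillow, used here only for the toy perceptual hash

def exact_fingerprint(path: str) -> str:
    """Cryptographic fingerprint: catches byte-identical copies only."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to an 8x8 grayscale image and record,
    for each pixel, whether it is brighter than the mean. Similar pictures
    (e.g., re-encoded or resized copies) produce similar bit patterns."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits

# Hypothetical store of fingerprints reported by users; only hashes are kept.
blocked_hashes = set()

def report_picture(path: str) -> None:
    """Register a picture's fingerprint so future uploads can be blocked."""
    blocked_hashes.add(average_hash(path))

def upload_is_blocked(path: str, max_distance: int = 5) -> bool:
    """Block an upload whose fingerprint is within a small Hamming distance
    of any reported fingerprint."""
    candidate = average_hash(path)
    return any(bin(candidate ^ h).count("1") <= max_distance
               for h in blocked_hashes)
```

The design point worth noticing is that only the fingerprints need to be stored and compared: the cryptographic hash catches byte-identical copies, while the perceptual hash also catches lightly altered copies, at the price of occasional false matches.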
According to TechCrunch, the tool is the result of a team-up between Facebook and Australia’s e-Safety agency, a governmental authority tasked with promoting online safety for all Australians. On the government side, Australian citizens who are at risk of falling victim to revenge porn can contact the authority, and e-Safety may offer, as a solution, that they send the picture to themselves through Messenger. In many ways, this is a form of co-regulation between Facebook and e-Safety: there is joint responsibility for regulatory design and enforcement, and some of the legitimacy of the regime is drawn from the public-private cooperation. According to TechCrunch, Facebook is also piloting this counter-revenge porn tool in three other countries.
But what is revenge porn? According to Derek Bambauer, revenge porn is “the practice of disclosing nude or sexually explicit pictures and videos, often along with identifying personal information, of former romantic partners without their consent.” [1] Meanwhile, Danielle Citron and Mary Anne Franks use the broader concept of “non-consensual pornography,” which means “the distribution of sexually graphic pictures of individuals without their consent. Pictures originally obtained without consent … as well as pictures originally obtained with consent, usually within the context of a private or confidential relationship.” [2] The latter definition is broader because it also covers distribution by people who had no prior relationship with the victim, a vital element once the pornographic picture has left the hands of the original distributor and offender.
The solution Facebook is offering addresses the problem of distribution through technological means. Its strength is that it requires no legislative change or update of the legal doctrine, as some scholars propose. Such proposals include, for instance, criminalizing revenge porn [2], using copyright to combat it [1,3], changing the legal liability of online service providers [4], or reinvigorating torts [5]. Facebook, on the other hand, takes us into the realm of individual action and the ability of individuals to preempt the distribution of their private pictures. Yet there is one bug in the Facebook solution: to prevent other users from uploading their most private pictures, Facebook asks individuals to first upload those very pictures to Facebook’s networks themselves. If hashing to block future distribution is the solution Facebook offers, there is an even better one, one that has some basis in the regulatory concept of a nudge.
According to Thaler and Sunstein, a nudge “is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives. To count as a mere nudge, the intervention must be easy and cheap to avoid” [6]. Academics study, and policy analysts recommend, nudges in policymaking on education and health, among other fields. Online, though, nudging occurs in different ways. Facebook (the company) is in fact a master of nudging when it comes to using the user interface of Facebook (the social network) to nudge users into sharing their information. For instance, in the past, Facebook nudged users to associate their “about” information with existing pages. Facebook pushed this change to enable another form of linking and connecting among users’ information. No longer does “mere” text say that I am a Ph.D. candidate at the Hebrew University; rather, a link to the Hebrew University’s page (created if it did not yet exist) now associates the Hebrew University with my profile. Through this page, Facebook can link my profile with other students and employees of the Hebrew University. In contrast, at the time, if I was not willing to turn the text into a link to the page, I had to remove the text altogether, while being constantly reminded (or nudged) by Facebook of the option to create the link. However, my recommendation for Facebook is to learn from an even more experienced “nudger” – Microsoft and how it has been nudging on Windows.
People might not be aware of it, but Microsoft has been nudging its consumers’ behavior for years. Antitrust and Internet scholars and professionals are familiar with Microsoft’s more extreme anti-competitive behaviors around software such as Internet Explorer. But Microsoft has, in fact, deployed many smaller nudges. Microsoft nudges users to save documents in the “My Documents” folder and to install software in the “Program Files” folder. Microsoft has also nudged users toward its own software and standards. The move to a software-as-a-service model has not changed this practice: while users may still save their documents and images in any folder they like, Microsoft now offers to save files in the OneDrive folder, the only folder in which Office 365 users enjoy automatic saving of files. Google follows the same practice with the folder for its cloud service, Google Drive, which it associates with the rest of its services, including Android and Gmail, thus enabling easy access from all platforms.
Going back to Facebook and its counter-revenge porn tool, learning from the nudging practices on Windows suggests one nudge that could make a difference. Instead of having users send intimate pictures to Facebook through its Messenger service, Facebook could develop an app with a designated cloud-like folder. Instead of uploading files to the cloud, the folder would hash pictures on the user’s end (i.e., on the computer or smartphone), and only the hash would be uploaded automatically to Facebook’s services. Facebook could also let users signal which folder holds the pictures they want hashed; these pictures can be intimate or not. While Facebook’s current tool might under-produce results, since some users will not be willing to send pictures unless they already fear becoming victims of revenge porn, the proposed change might lead to more pictures being hashed automatically. Users could place any picture in that folder, intimate or not, knowing it will be hashed and that other users will not be able to upload it. It would be no different from other nudges that occur on other operating systems. True, this means the app might over-produce hashes, including hashes of non-intimate pictures or pictures their creators simply do not want to see online (e.g., copyrighted or incriminating pictures). But as a starting point, nudging users to let software hash pictures on their end and send only those hashes can lower the chances of revenge porn through technical means, as sketched below.
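To make the proposal concrete, here is a minimal sketch, in Python, of what hashing a designated folder on the user’s end might look like. Everything specific here is an assumption for illustration: the folder name, the set of image extensions, and the submit_hashes placeholder are hypothetical, and a real app would presumably use a perceptual hash robust to re-encoding and an authenticated API on Facebook’s side, neither of which is publicly documented.

```python
import hashlib
from pathlib import Path
from typing import List

# Hypothetical folder the app would designate for pictures to protect.
WATCHED_FOLDER = Path.home() / "ProtectedPictures"
IMAGE_SUFFIXES = {".jpg", ".jpeg", ".png", ".gif"}

def hash_file(path: Path) -> str:
    """Compute the fingerprint locally; the picture itself never leaves the device."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def collect_hashes(folder: Path) -> List[str]:
    """Hash every picture the user placed in the designated folder."""
    return [hash_file(p) for p in sorted(folder.iterdir())
            if p.is_file() and p.suffix.lower() in IMAGE_SUFFIXES]

def submit_hashes(hashes: List[str]) -> None:
    """Placeholder: a real app would send only these hashes, over an
    authenticated channel, to the platform's blocking service."""
    for h in hashes:
        print("would submit fingerprint:", h)

if __name__ == "__main__":
    if WATCHED_FOLDER.exists():
        submit_hashes(collect_hashes(WATCHED_FOLDER))
```

In this arrangement the nudge is the folder itself: anything the user drops there gets fingerprinted automatically, and only the fingerprints ever travel to the platform.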
[1] Bambauer, Derek. 2014. "Exposed." Minnesota Law Review 98: 2025-2102.
[2] Citron, Danielle Keats, and Mary Anne Franks. 2014. "Criminalizing Revenge Porn." Wake Forest Law Review 49: 345-391.
[3] Levendowski, Amanda. 2014. "Using Copyright to Combat Revenge Porn." NYU Journal of Intellectual Property and Entertainment Law 3: 422-446.
[4] Citron, Danielle Keats. 2013. "Revenge Porn and the Uphill Battle to Pierce Section 230 Immunity (Part II)." Concurring Opinions. http://www.concurringopinions.com/archives/2013/01/revenge-porn-and-the-uphill-battle-to-pierce-section-230-immunity-part-ii.html.
[5] Chander, Anupam. 2010. "Youthful Indiscretion in an Internet Age." In The Offensive Internet, edited by Saul Levmore and Martha C. Nussbaum, 124, 129-33.
[6] Thaler, Richard H., and Cass R. Sunstein. 2008. Nudge: Improving Decisions about Health, Wealth, and Happiness. New Haven: Yale University Press.