By: Amir Cahane and Yuval Shany
In recent years, law enforcement and intelligence agencies have noted the growing prevalence of encryption with concern. Cryptographic measures are no longer limited to the domain of states and sophisticated commercial and financial actors. The increasing availability and user-friendliness of encryption methods, coupled with their default deployment by online service providers, have made common communications less and less accessible to government authorities, even when such access is authorized by law.
This proliferation of encrypted communication methods causes previously accessible intelligence sources to ‘go dark’, placing them well beyond agencies’ collection capacities. The prevalent use of end-to-end encryption obscures content from third parties, including the service providers themselves, so that it cannot be accessed without introducing ‘back door’ methods that may compromise the services’ security. Existing legal mechanisms compelling service providers to give law enforcement and intelligence agencies access to communication content – content which remains encrypted – are thus rendered insufficient.
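To make the technical premise concrete, the following minimal Python sketch – an illustration only, using the PyNaCl library, with hypothetical parties ‘alice’ and ‘bob’ – shows why a service provider that merely relays end-to-end encrypted messages has nothing intelligible to hand over: it never holds the private keys needed for decryption.

    # End-to-end encryption sketch using PyNaCl (libsodium bindings).
    # 'alice' and 'bob' are hypothetical users; the provider relays only ciphertext.
    from nacl.public import PrivateKey, Box

    # Each party generates a key pair on their own device;
    # the private keys never reach the service provider.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Alice encrypts for Bob with her private key and Bob's public key.
    sending_box = Box(alice_private, bob_private.public_key)
    ciphertext = sending_box.encrypt(b"meet at noon")

    # The ciphertext is all the provider stores or can be compelled to produce.

    # Bob decrypts with his private key and Alice's public key.
    receiving_box = Box(bob_private, alice_private.public_key)
    assert receiving_box.decrypt(ciphertext) == b"meet at noon"

Any mechanism that gave the provider (and thus the state) routine access to the plaintext would have to weaken precisely this separation of keys – which is the ‘back door’ concern noted above.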
The ‘going dark’ debate reintroduces various schemes and proposals by policy makers aimed at regaining agencies’ ability to access the content of encrypted materials. Fresh examples can be found in US DOJ attempts to obtain court orders compelling service providers to decrypt their products, and in provisions of recently enacted UK and Australian legislation granting national security and law enforcement agencies powers to secretly compel IT companies and online service providers to redesign their products so as to enable those agencies to spy on users.
Given the prevalence of encryption in everyday life, the encryption debate, which is usually reduced to balancing individual privacy rights against security interests, should be reframed more broadly. Encryption may be utilized to secure not only speech, but also votes, autonomous cars, IoT devices, and financial transactions. Accordingly, formulating a set of encryption rights can provide better insight into the underlying interests such rights protect, interests which extend beyond individual privacy. They may encompass IT integrity rights, property rights, political rights and, in connection with critical infrastructure cybersecurity threats, even the right to life.
A better understanding of encryption rights, their underlying interests and their legal footing can enrich the analysis of alternatives to decryption, and provide critical insight into various proposals to institute government back doors under the obfuscating rhetoric of ‘responsible encryption’, ‘front door’ or ‘golden key’.
Under the proposed theoretical framework for encryption rights, three separate rights can initially be identified: the freedom to encrypt, the right against decryption and the duty to encrypt – all of which are rights in the Hohfeldian sense.[1]
The freedom to encrypt entails that its holder is free from any limitations on encryption. Practically, it means that there is no legal prohibition on the act of encryption (in Hohfeldian terms, there is no duty not to encrypt). Where there is no freedom to encrypt, the remaining encryption rights, which relate to encrypted data and to a positive duty to encrypt, may be rendered moot. The freedom to encrypt protects fundamental human rights and interests such as autonomy, privacy and freedom of speech – and, to some extent, personal security and property rights. We may locate the freedom to encrypt at the hard core of encryption rights, yet this freedom is not without limits. Malicious encryption – such as ransomware, which denies individuals access to their proprietary or otherwise personal data – should not enjoy the protection of this freedom.
The freedom to encrypt does not entail a right against decryption (i.e., the right that one's encrypted content remain as such and not be decrypted – whether by malevolent actors, such as cybercriminals, or by benevolent actors, such as law enforcement agencies). In the absence of a right against decryption, data encrypted pursuant to the freedom to encrypt may be subject to decryption attempts. Where a right against decryption is in place and enforced, all actors have fewer incentives to engage in decryption, and fewer incentives to engage in a cryptographic arms race.
Like the freedom to encrypt, the right against decryption protects fundamental human rights such as autonomy, privacy and freedom of speech. The right against decryption may also serve to protect proprietary interests, whether in intellectual property (DRM), in financial holdings (by securing online financial transactions) or in tangible property, as emerging cyber threat vectors may result in actual, tangible damage. Encryption blocks malevolent actors from gaining control over IoT devices or autonomous cars, an event which may result not only in proprietary damage but also in bodily harm or loss of life. Furthermore, government-mandated back doors, or government hoarding of zero-day attack capabilities, extend beyond risks to particular individuals and pose a threat to the security and integrity of IT systems as a whole.[2] In light of this systemic risk,[3] the right against decryption can be understood as encompassing societal interests akin to those protected under the German right to the integrity of IT systems[4] or under data protection rights. The deployment of the right against decryption is, however, not limited to the ‘going dark’ debate; it extends to other contested issues, such as DRM and fair use policies,[5] or to matters pertaining to digital death and the ability to decrypt data forever locked away by the dead.[6]
A third Hohfeldian encryption right is the duty to encrypt, which is the duty imposed on parties to encrypt communications and data. Data protection rules, such as the European GDPR, provide an example of how this duty may be employed.[7] This duty can be deemed complementary to the freedom to encrypt, as the freedom to encrypt does not analytically entail encryption. The interests underlying encryption rights may call for imposing encryption duties on service providers where individuals do not realize the freedom to encrypt themselves, typically due to lack of awareness or technical know-how.
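By way of illustration only, the sketch below shows the kind of technical measure contemplated by GDPR Art. 32(1)(a). It uses the Python ‘cryptography’ library; the record contents and the storage step are invented, and the GDPR does not mandate any particular scheme – this is merely one plausible way a controller might encrypt personal data at rest.

    # Illustrative encryption-at-rest sketch in the spirit of GDPR Art. 32(1)(a),
    # using the Python 'cryptography' library; record contents are invented.
    from cryptography.fernet import Fernet

    # The controller/processor generates and safeguards the key;
    # data subjects' records are written to storage only in encrypted form.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    record = b"name=Jane Doe;email=jane@example.com"
    encrypted_record = cipher.encrypt(record)   # what actually gets stored

    # Decryption remains possible only with the controller-held key.
    assert cipher.decrypt(encrypted_record) == record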
The proposed research paper will aim to offer a detailed framework for the encryption rights outlined in broad strokes above, while addressing justifiable encroachments upon them and tracing any existing statutory grounding of such rights in EU and US law.
[1] See Wesley Hohfeld, Fundamental Legal Conceptions as Applied in Judicial Reasoning (1923); Matthew H. Kramer, Rights Without Trimmings, in Matthew H. Kramer, N.E. Simmonds and Hillel Steiner, A Debate Over Rights (1998).
[2] Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter and Daniel J. Weitzner, Keys under doormats: mandating insecurity by requiring government access to all data and communications, 1 Journal of Cybersecurity 69–79 (2015).
[3] See also the prohibition against requiring service providers to implement a systemic weakness or systemic vulnerability in Section 317ZG of the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018 (Cth).
[4] 120 BVerfGE 274, 302 (2008).
[5] See, for example, Ben Fernandez, Digital Content Protection and Fair Use: What's the Use, 3 J. on Telecomm. & High Tech. L. 425, 452 (2005); Aaron Perkins, Encryption Use: Law and Anarchy on the Digital Frontier, 41 Hous. L. Rev. 1625, 1658 (2005).
[6] James D. Lamm, Christina L. Kunz, Damien A. Riehl and Peter John Rademacher, The Digital Death Conundrum: How Federal and State Laws Prevent Fiduciaries from Managing Digital Property, 68 U. Miami L. Rev. 385, 420 (2014); Naomi Cahn, Probate Law Meets the Digital Age, 67 Vand. L. Rev. 1697, 1728 (2014).
[7] Under Art. 32(1)(a) of the GDPR, data controllers and data processors "shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data"; see also Gerald Spindler and Philipp Schmechel, Personal Data and Encryption in the European General Data Protection Regulation, 7 J. Intell. Prop. Info. Tech. & Elec. Com. L. 163, [ii] (2016).