2.3 Legal Protection by Design
1 LPbD is a term I have coined1 to refer to the articulation of legal protection into the prevailing information and communication infrastructures (ICIs), most notably the legal protection provided by fundamental rights2 and the checks and balances of the Rule of Law.
2 LPbD is not equivalent to Lawrence Lessig’s ‘Code as Law’,3 which frames the normative force of computing code in terms of ‘architecture’, next to social, economic and legal norms. This Project is based on the understanding that social, economic and legal norms overlap in various ways, while also highlighting that the extent to which computer code determines human behaviour depends on the affordances of the relevant computing systems.4
3 LPbD should not be confused with techno-regulation,5 which refers to both legal and non-legal regulatory effects of technologies and, in the case of the latter, to both deliberate and accidental effects. Based on the understanding that ‘technology is neither good nor bad, but never neutral’,6 technologies have normative affordances that may be part of deliberate design or engineering decisions aimed at specific intended effects, though such normative affordances may also be what are often called side-effects.
4 LPbD must also be distinguished from ‘ethics by design’7 or ‘values by design’,8 which is based on the acknowledgement that any design will have normative and possibly moral implications, inevitably embedding certain values, whether or not the designer is aware of this. LPbD aims to incorporate the specific values of fundamental rights and the checks and balances of the Rule of Law into prevailing ICIs, grounding the design in democratic participation and legislation while ensuring contestability as core and actionable values of legal protection.
5 LPbD should also not be confused with ‘legal by design’,9 which refers to a specific type of techno-regulation whereby legal norms are, for instance, translated into code or into the design of computing systems such that compliance becomes automated or semi-automated. Think of self-executing code as in smart contracts or smart regulations, or of data-driven techniques for the prediction of judgments deployed to make decisions.
6 This Project takes the position that legal norms cannot apply themselves and require interpretation, thus enabling contestation. This is why we believe that ‘legal by design’ is an oxymoron.10
7 This Working Paper aims to explain what legal protection is afforded by a text-driven ICI, thus raising the question of how text-driven design contributes to legal protection.
The COHUBICOL Project proposal
In the Cover Page Summary and Abstract we read that:
The intermediate goals are an in-depth assessment of the nature of legal protection in text-driven law, and of the potential for legal protection in data-driven and code-driven law.
The proposal explains the role of the concept of LPbD as follows:
The third concept that will inform the analysis is that of legal protection by design, not to be confused with ‘legal by design’.11 The latter refers to code-driven law that includes its own automated execution, thus conflating legislation, interpretation/execution and adjudication, for instance by way of a blockchain application. With Brownsword,12 I would argue that ‘legal by design’ is an oxymoron, as our current notion of law assumes that we are capable of disobeying its normative force. A ‘law’ that cannot be disobeyed is not law but discipline or administration. Legal protection by design, on the other hand, takes note of the fact that data- and code-driven law have a different normative force than text-driven law, because they can actually force our hand (code-driven law) or predict legal outcome without providing arguments (data-driven law). Legal protection by design obligates those who build the architectures and applications of computational law to develop these systems in ways that reinstate the kind of protection that is pivotal for the Rule of Law: it will for instance require the testability of these systems as a precondition for the contestability of their output (e.g. stipulating open source software); it will require default settings that introduce procedural checks and balances, compensating for inequalities or unfair distributions (e.g. detecting problematic bias in training data or algorithms). Based on the different affordances of text-driven and computational law, the research will thus develop new ways to think about legal protection, aiming to ensure that law’s new modes of existence will not escape the core safeguards of the Rule of Law – even if that means reconstructing such safeguards in the computational architecture of law’s novel technological embodiment by means of legal protection by design.
The role of the concept in relation to text-driven law is further explained:
The foundational impact on legal methodology is validated in part by the intermediate focus on legal protection in text-driven and data- and code-driven law. Developing an understanding, on the cusp of law and computer science, of the type of conditions that must be met for computational law to offer genuine and effective legal protection will contribute to testing and pruning the novel conceptual tools as well as the ensuing new hermeneutics. This way the research will also contribute more concretely to the innovation of legal method, for instance by figuring out how the upcoming legal obligation of data protection by design (as an instance of legal protection by design) can be an effective legal condition for legitimate processing of personal data.
The bold part of the quotation refers to the relevance of the GDPR’s art. 25 that instantiates a legal obligation to engage in ‘data protection by default and by design’. The current proposal of the EU Artificial Intelligence Act has many instances where developers and providers of AI systems are required to build legal protection into their systems, for instance in art. 14.3(a) which requires the provider to ensure built-in human oversight — insofar as technically feasible, and art. 15 which straightforwardly requires providers to ensure that their AI systems have been designed with built-in accuracy, robustness and cyber security. Similarly, the obligation to keep logs for both providers (art. 12, 20) and users (art. 16(d)) can be seen as a typical obligation of LPbD as it contributes to accountability.
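The accountability rationale behind these logging obligations can be made concrete with a minimal, purely illustrative sketch (the design and all names are hypothetical assumptions of this example, not prescribed by the AI Act): a hash-chained event log is one way a provider could make the records required by arts. 12, 16 and 20 tamper-evident, so that the output of a system remains contestable after the fact.

```python
# Hypothetical sketch: an append-only, hash-chained decision log.
# Each entry commits to the hash of the previous one, so any later
# alteration of a recorded decision breaks verification.
import hashlib
import json

def append_event(log, event):
    """Append an event, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": entry_hash})
    return log

def verify(log):
    """Recompute the whole chain; a tampered entry makes this return False."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if hashlib.sha256((prev_hash + payload).encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_event(log, {"decision": "deny", "input_id": 1})
append_event(log, {"decision": "allow", "input_id": 2})
assert verify(log)
log[0]["event"]["decision"] = "allow"  # tampering with a past decision
assert not verify(log)                 # is now detectable
```

The point of the sketch is only that accountability can be built in by design: whoever contests a decision can check that the record of it has not been rewritten, without having to trust the provider's say-so.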
This also relates to contributions from the side of computer science:
While further developing the notion of legal protection by design this research will introduce the concept of effective testability for artificial legal judgment and self-executing legal code. Such testability will be investigated as a new articulation of the legal requirement that decisions concerning human beings should in principle be contestable. The investigation into such testability requires close collaboration between legal scholars and experts in data science and encryption. It is, for instance, related to the interpretability problem in machine learning and more generally to the opacity problem in algorithmic decision-making.13
This points to the particular kind of collaboration this project develops between lawyers and computer scientists:
The challenge will be to discuss, test and develop examples of legal protection by design without actually building applications of computational law. This implies that we move into the realm of speculation, through thought experiments and counterfactual exploration, though not in the sense of freewheeling fantasy. Quite on the contrary, building on the findings of the second year, the lawyers will interrogate the computer scientists about the kind of protection that can be designed given the assumptions of machine learning and blockchain applications and the implications they generate.14 They will, for instance, inquire how discrimination aware data mining could resolve some types of problematic bias, or how different ways of gathering training data affects the output of quantified legal prediction, or how Van der Lei’s first law of medical informatics would apply to legal informatics.15 The computer scientists will develop their own questions as well as propositions, for instance, the question of whether attempts to develop applications that give reasons for their output would count as them ‘giving explanations’ in the legal sense, or a proposition for enhancing legal prediction software with different types of algorithms that generate different outcomes for the same training set, thus enabling multi-interpretability. The generic aim of the research in these years is to develop a set of architectural requirements that afford testability, interpretability and contestability. This confirms the normative position this project takes, as such requirements should afford a mode of existence for computational law that respects the central tenets of the Rule of Law where it comes to automated decisions that have a significant effect on the capabilities of the human beings they affect.
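The proposition about multi-interpretability can be illustrated with a minimal sketch (the toy data and both rules are hypothetical assumptions of this example, not taken from the proposal): two different prediction rules, fitted to the very same training set, can return different outcomes for the same new case, which is the sense in which different algorithms enable multi-interpretability.

```python
# Hypothetical illustration: two simple "prediction" rules trained on
# the same toy dataset of case features -> outcome ('allow' / 'deny').
TRAIN = [
    ((1, 0, 1), "allow"),
    ((1, 1, 1), "allow"),
    ((0, 0, 0), "deny"),
    ((0, 1, 0), "deny"),
    ((1, 1, 0), "deny"),
]

def majority_rule(_case):
    """Rule 1: always predict the most frequent training outcome."""
    outcomes = [y for _, y in TRAIN]
    return max(set(outcomes), key=outcomes.count)

def nearest_neighbour(case):
    """Rule 2: predict the outcome of the most similar training case."""
    def distance(a, b):
        return sum(x != y for x, y in zip(a, b))
    _, outcome = min(TRAIN, key=lambda pair: distance(pair[0], case))
    return outcome

query = (1, 0, 1)
print(majority_rule(query), nearest_neighbour(query))  # → deny allow
```

Both rules are "trained" on the same five cases, yet for the query they disagree; scaled up, running several such algorithms side by side would surface the fact that the same data admits more than one defensible outcome, rather than presenting a single prediction as the answer.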
Smart Technologies and the End(s) of Law
In chapter 10, I discuss LPbD:16
The argument is that without LPbD we face the end of law as we know it, though, paradoxically, engaging with LPbD will inevitably end the hegemony of modern law as we know it. There is no way back; we can only move forward. However, we have different options: either law turns into administration or techno-regulation, or it re-asserts its ‘regime of veridiction’ in novel ways.
The chapter then teases apart two interactions between law and technology. First, it explains the need for technology-neutral law and why, paradoxically, technology-specific law is required to achieve such neutrality:17
In other work we have evaluated the arguments that have been made for technology neutral law by regulators, business and legal scholars.18 The arguments can be grouped together under three different objectives: first, the innovation objective that aims to prevent technology specific regulation as it might unfairly constrict the field or the development of specific technologies, thus interfering with the freedom to conduct a business; second, the sustainability objective that aims to prevent legislation from becoming outdated all too soon, because the changes in the technological landscape make it ineffective with regard to the goal it was supposed to serve; and third, the compensation objective that aims to redress erosion of the substance of a fundamental right which occurs as a side-effect of a new technology.
In this chapter we focus on the compensation objective, to explain why technology specific law may sometimes be necessary in order to sustain the neutrality of the law with regard to emerging technologies. Neutrality means here that the mere fact that a new ICI is emerging should not diminish the substance and effectiveness of legal protection. This aligns with the approach Nissenbaum has developed in her decision heuristic with regard to contextual integrity,19 investigating whether and how a new socio-technical practice infringes existing values. This entails taking a prudent but not a prudish position with regard to norms and values such as privacy or contextual integrity. The approach is prudent in so far as it focuses on existing rights or values, not necessarily advocating new ones. It is not prudish because it recognizes that to defend and preserve these values or rights, their substance and effectiveness must be evaluated in the light of the relevant new technologies, taking into account that the design of such technologies makes a difference for the values and the legal norms they enable or overrule. To some extent, we must accept that a new ICI may induce a reconfiguration of our norms and values; the point is that a reconfiguration should not go so far as to erase the substance of existing values merely because that fits new business models or more efficient administration. From the perspective of law in a constitutional democracy, we can add that legal norms are enacted or condoned by the democratic legislator and changing their scope should not be done without involving the constituency that is at stake.
An altogether different — though nevertheless related — question is whether there can be such a thing as technologically neutral law:20
In Chapters 7 and 8 we saw that modern law has thrived on the affordances of the printing press. Text formats the extended mind of the lawyers; it feeds on an external memory and a systematized archive that comprises of codes, treaties, statutes, case law, doctrinal treatises and theoretical reflection. The proliferation of legal text has invited and enabled abstraction, complexity and systemization. It has generated the need for reiterative interpretation, paradoxically inviting both contestation and authoritative closure. Hesitation, doubt and consideration are situated in the heart of the law, instituting the antagonistic prerequisite for the competence to enact legislation and to issue a verdict. Written law externalizes legal norms, thus making contestation possible and final decisions necessary. This has led me to claim that law is not technologically neutral; its characteristics are contingent upon the ICI that mediates its verdict — its ‘regime of veridiction’ — and its mode of existence. As argued in Chapters 7 and 8, we cannot assume that the ICI of pre-emptive computing has affordances similar to those of the printing press.
This requires a rethinking of the legal embodiment of the law in an onlife world, since we cannot expect to regulate our new world via a law that is entirely inscribed via the ICI of a previous era. Re-articulation of the law in the emerging ICI will be necessary in so far as we wish to re-establish the fundamental rights developed in the era of the printing press. This does not mean that written law can be discarded. On the contrary, the externalization of legal norms that makes them contestable and enforceable should be preserved. But the nature of written law will somehow change. The spoken word did not disappear when we started writing, nor did unwritten law lose its bearing when written law became dominant, though some lawyers may deny that unwritten law has the force of law (law remains an essentially contested concept). What matters here is that the spoken word and unwritten law were transformed by their relationship with text. Before the script the notion of an unwritten law did not exist; before the arrival of ‘the online’ there was no such thing as ‘an offline’. Mozart did not think of the performance of his music as being unplugged. We may expect similar transformations of our dealings with printed matter, due to the impact of pre-emptive computing. In that sense the hegemony of modern law, contingent upon the affordances of printed text, will end once we learn how to integrate legal norms in pre-emptive computing systems. This need not be the end of law if we develop new ways to preserve what differentiates law from administration and techno-regulation.
This then raises the question of how this relates to the affordances of text-driven, code-driven and data-driven ICIs:21
The reader may believe that LPbD is an attempt to apply affordances of the script and the printing press to an ICI that has very different affordances. Such an attempt is bound to fail. Obviously affordances cannot be applied; they can be detected and to some extent they can be tweaked or designed. The attempt is to detect, configure or design affordances that are compatible with specific legal norms that might otherwise lose their force, or to develop socio-technical systems that embody specific legal norms. This should always include attention to the ‘resistability’ and contestability of the ensuing normativity, and should always involve testing how the configuration or design of the affordances can best serve the goals of justice, legal certainty and purposiveness. Developing a methodology of LPbD entails a vertiginous challenge to traditional doctrinal research methods within legal scholarship and to the scientific methods of computer science, requirements engineering and electronics. No one area should colonize another, but LPbD is not a matter of different disciplines exchanging ideas. The point of departure is the task of articulating compatibility with a legal norm into an architecture, protocol, standard, hardware configuration, operating system, App or grid.
M. Hildebrandt, ‘A Vision of Ambient Law’ in R. Brownsword and K. Yeung (eds), Regulating Technologies (Hart 2008); M. Hildebrandt and B.-J. Koops, ‘The Challenges of Ambient Law and Legal Protection in the Profiling Era’ (2010) 73 The Modern Law Review 428; M. Hildebrandt, ‘Legal Protection by Design: Objections and Refutations’ (2011) 5 Legisprudence 223; M. Hildebrandt, Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology (Edward Elgar Publishing 2015); M. Hildebrandt and L. Tielemans, ‘Data Protection by Design and Technology Neutral Law’ (2013) 29 Computer Law & Security Review 509. ↩
C.L. Geminn, Rechtsverträglicher Einsatz von Sicherheitsmaßnahmen im öffentlichen Verkehr (Springer Fachmedien Wiesbaden 2014) http://link.springer.com/10.1007/978-3-658-05353-6 accessed 5 August 2015. ↩
L. Lessig, Code: Version 2.0 (Basic Books 2006). ↩
M. Hildebrandt, ‘Legal and Technological Normativity: More (and Less) than Twin Sisters’ (2008) 12 Techné: Journal of the Society for Philosophy and Technology 169; M. Hildebrandt, ‘The Force of Law and the Force of Technology’ in M.R.P. McGuire and T. Holt (eds), The Routledge International Handbook of Technology, Crime and Justice (Routledge 2017) https://www.bookdepository.com/Routledge-International-Handbook-Technology-Crime-Justice-M-R-P-McGuire/9781138820135 accessed 5 July 2016. ↩
R. Leenes, ‘Framing Techno-Regulation: An Exploration of State and Non-State Regulation by Technology’ (2011) 5 Legisprudence 143. ↩
M. Kranzberg, ‘Technology and History: “Kranzberg’s Laws”’ (1986) 27 Technology and Culture 544. ↩
J. van den Hoven, P.E. Vermaas and I. van de Poel (eds), Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains (2015 edition, Springer 2015). ↩
P.-P. Verbeek, ‘Materializing Morality: Design Ethics and Technological Mediation’ (2006) 31 Science, Technology, & Human Values 361. ↩
P. Lippe, D.M. Katz and D. Jackson, ‘Legal by Design: A New Paradigm for Handling Complexity in Banking Regulation and Elsewhere in Law’ (2015) 93 Oregon Law Review 833. ↩
M. Hildebrandt, Law for Computer Scientists and Other Folk (Oxford University Press 2020), chapter 10. ↩
P. Lippe, D.M. Katz and D. Jackson, ‘Legal by Design: A New Paradigm for Handling Complexity in Banking Regulation and Elsewhere in Law’ (2015) 93 Oregon Law Review 833. ↩
R. Brownsword, Rights, Regulation, and the Technological Revolution (Oxford University Press 2008). ↩
B. Lepri and others, ‘Fair, Transparent, and Accountable Algorithmic Decision-Making Processes’ Philosophy & Technology 1. ↩
For some initial work in this vein, see L.E. Diver, Digisprudence: Code as Law Rebooted (Edinburgh University Press, forthcoming). (Note that this reference did not appear in the project proposal.) ↩
F. Cabitza, D. Ciucci and R. Rasoini, ‘A Giant with Feet of Clay: On the Validity of the Data That Feed Machine Learning in Medicine’ Lecture Notes in Information Systems and Organisation 121. ↩
M. Hildebrandt, Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology (Edward Elgar Publishing 2015), p. 214. ↩
M. Hildebrandt, Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology (Edward Elgar Publishing 2015), pp. 215–16. ↩
M. Hildebrandt and L. Tielemans, ‘Data Protection by Design and Technology Neutral Law’ (2013) 29 Computer Law & Security Review 509. ↩
H. Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford Law Books 2010). ↩
M. Hildebrandt, Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology (Edward Elgar Publishing 2015), pp. 217–18. ↩
M. Hildebrandt, Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology (Edward Elgar Publishing 2015), p. 218. See also the related discussion of ‘digisprudential affordances’ in L. Diver, ‘Digisprudence: The Design of Legitimate Code’ (2021) 13(2) Law, Innovation & Technology (forthcoming). ↩