Legal Technology – Preconditions, Opportunities, and Risks

Wolfgang Hoffmann-Riem*

A. Digitalisation as a disruptive innovation

In the previous millennium, there were two great disruptive technological innovations, which led to profound upheavals in society. One was the invention of the printing press; the other, industrialisation. Since the start of this millennium, we have been going through another period of technological upheaval, which many scholars – myself included – believe will cause social change of a magnitude similar to that brought about by the aforementioned innovations. I am referring to digitalisation and the associated digital transformation of the economy, culture, politics, and public and private communication1 – indeed, probably of virtually every area of life. It is essential to prepare for the fact that digitalisation will have a significant and growing impact on society.

Computers and digital technologies have of course existed for many decades. Today, however, their capabilities have changed dramatically. The rate of innovation has accelerated. The creation of global infrastructures has led to worldwide networks, and the mobility of communication – symbolised for example by the smartphone or the use of clouds – creates further space for new developments and changes to individual and social life. Current buzzwords like Big Data, artificial intelligence, robotics, and blockchain are shorthand for further leaps in development.

B. Opportunities and risks associated with the development of legal technology

One of the new buzzwords is legal technology (“Legal Tech”).2 The term describes the use of digital technologies to assist in identifying, interpreting, and applying the law and, in some instances, also in creating it. As a result, traditional law is being practiced in new ways and it is even being replaced in some areas. The role of lawyers is changing.3 We are witnessing the emergence – often in collaboration between lawyers and IT experts – of new forms of legal advice and other legal services, the use of new kinds of expert legal knowledge, the digital analysis of documents, and attempts to predict future judgments by courts.4 Legal technology makes it easier to perform research and evaluate legal sources, for instance as the basis for legal advice or strategic litigation.


* Affiliate Professor of Innovation and Law at Bucerius Law School, Hamburg. Professor Emeritus of Public Law and Public Administration at the University of Hamburg. From 1999 to 2008, Justice on the German Federal Constitutional Court. The text is a revised version of the keynote delivered at the Bucerius Summer Program in Legal Technology and Operations on July 1, 2019.
1 On digital transformation, see, e.g., Ahmed Bounfour, Digital Futures, Digital Transformation, 2016; Tim Cole, Digitale Transformation, 2d ed. 2017; Christoph Keese, Silicon Germany: Wie wir die digitale Transformation schaffen, 2017; Oliver Stengel/Alexander van Looy/Stephan Wallaschkowski (eds.), Digitalzeitalter – Digitalgesellschaft: Das Ende des Industriezeitalters und der Beginn einer neuen Epoche, 2017; Reinhard Pfliegl/Claus Seibt, Die digitale Transformation findet statt!, e&i Elektrotechnik und Informationstechnik, 2017, 333–339; Viktor Mayer-Schönberger/Thomas Ramge, Das Digital: Markt, Wertschöpfung und Gerechtigkeit im Datenkapitalismus, 2017; Arno Rolf, Weltmacht Vereinigte Daten: Die Digitalisierung und Big Data verstehen, 2018; Barbara Kolany-Raiser/Reinhard Heil/Carsten Orwat/Thomas Hoeren (eds.), Big Data und Gesellschaft: Eine multidisziplinäre Annäherung, 2018; Shoshana Zuboff, The Age of Surveillance Capitalism, 2019. From a primarily philosophical standpoint: Richard David Precht, Jäger, Hirten, Kritiker: Eine Utopie für die digitale Gesellschaft, 2018.

2 Examples from the growing academic debate about Legal Tech in Germany: Oliver Raabe/Richard Wacker, Recht ex Machina: Formalisierung des Rechts im Internet der Dienste, 2012; Kyriakos N. Kotsoglou, Subsumtionsautomat 2.0, JZ 2014, 451 et seq.; Yorck Frese, Recht im zweiten Maschinenzeitalter, NJW 2015, 2090 et seq.; Martin Fries, PayPal Law und Legal Tech – Was macht die Digitalisierung mit dem Privatrecht? NJW 2016, 2860 et seq.; Gabriele Buchholtz, Legal Tech: Chancen und Risiken der digitalen Rechtsanwendung, JuS 2017, 955 et seq.; Volker Boehme-Neßler, Die Macht der Algorithmen und die Ohnmacht des Rechts: Wie die Digitalisierung das Recht relativiert, NJW 2017, 3031 et seq.; Jens Wagner, Legal Tech und Legal Robots in Unternehmen und den diese beratenden Kanzleien, BB 2017, 898 et seq.; Christian Ernst, Algorithmische Entscheidungsfindung und personenbezogene Daten, JZ 2017, 1026 et seq.; Annika Klafki/Felix Würkert/Tina Winter (eds.), Digitalisierung und Recht, Bucerius Law Journal Press, 2017; Markus Hartung, Gedanken zu Tech und Digitalisierung, BLJ 2017, 151 et seq.; Mario Martini, Transformation der Verwaltung durch Digitalisierung, DÖV 2017, 443 et seq.; Hariolf Wenzler, Big Law & Legal Tech, BLJ 2017, 157 et seq.; Jan Lüttinghaus, Legal Science Tech: Quantität kommt von digital, ZEuP 2018, 1097 et seq.; Jan Max Wettlaufer, Vertragsgestaltung, Legal Techs und das Anwaltsmonopol, BB 2018, 55 et seq.

3 See Richard Susskind, The End of Lawyers? Rethinking the Nature of Legal Services, 2010; Richard Susskind, Tomorrow's Lawyers: An Introduction to Your Future, 2d ed. 2017.

4 See Susskind, supra note 3. Concerning the emergence of the Legal Tech market in Germany, see, e.g., Dominik Tobschall/Johann Kempe, Der deutsche Legal-Tech-Markt, in Stephan Breidenbach/Florian Glatz (eds.), Rechtshandbuch Legal Tech, 2018, 25 et seq.


Decisions that used to be made by humans are increasingly being made in an automated manner. Examples include automated administrative decision-making and forms of digitally mediated problem-solving.5 Legal Tech is playing an increasingly important role in e‑government6 and e-justice.7 Algorithmic regulation and governance by algorithms are further new buzzwords.

Of increasing importance is the transaction technology blockchain,8 which is conquering new fields, such as the storage of legally relevant data in a confidential manner. The technology makes it possible to ensure reliable remuneration for works protected by copyright, to create digital registers, such as land registers, to enable secure cross-border online transactions, and much more.9
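The integrity guarantee behind such registers can be made concrete with a small sketch. The following is a minimal, purely illustrative Python example – the field names and records are invented – of a hash-linked register in which every entry commits to its predecessor, so that subsequent tampering with any record becomes detectable:

```python
# Minimal sketch of a tamper-evident, hash-linked register (illustrative only).
import hashlib
import json

def entry_hash(record, prev_hash):
    # Serialise deterministically before hashing.
    payload = json.dumps({"record": record, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(register, record):
    prev_hash = register[-1]["hash"] if register else "0" * 64
    register.append({"record": record, "prev_hash": prev_hash,
                     "hash": entry_hash(record, prev_hash)})

def verify(register):
    prev_hash = "0" * 64
    for entry in register:
        if entry["prev_hash"] != prev_hash:
            return False  # chain broken
        if entry["hash"] != entry_hash(entry["record"], entry["prev_hash"]):
            return False  # record altered after the fact
        prev_hash = entry["hash"]
    return True

land_register = []
append(land_register, {"parcel": 17, "owner": "A"})
append(land_register, {"parcel": 17, "owner": "B"})  # transfer of ownership
assert verify(land_register)
land_register[0]["record"]["owner"] = "C"            # attempted manipulation
assert not verify(land_register)                     # is detected
```

An actual blockchain adds replication across many nodes and a consensus mechanism, which is what makes reliance on such a register plausible even among parties who do not trust one another.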

Commercial enterprises – even those with no previous experience in the field of law – have begun developing appropriate software for lawyers. They include several start-ups but also big players, such as IBM and Google. IBM, for instance, offers Watson, the artificial intelligence-based system underlying the legal research tool Ross. As a research tool, it can sort through more than one billion text documents per second, and it is becoming increasingly capable of handling and understanding legal texts. It can even be used as an “artificially intelligent attorney”.10

Legal Tech is expected to generate considerable savings in terms of transaction costs, as well as to make the analysis of source materials and the preparation and making of decisions and their implementation faster, more efficient, and more effective. Legal technology also makes it possible to eliminate certain barriers to access to the law.

At the same time, Legal Tech raises various questions: Will it correctly capture or instead miss the complexity and multidimensional nature of conflicts that law is called upon to resolve? Will it adequately take into account the vagueness and ambiguity of legal terms and the problems related to the use of discretion? Will it reduce the diversity of the factors relevant for reaching decisions or instead allow them to increase? Will the rule of law be adhered to? Will there be sufficient guarantees of transparency? Are there risks that accountability for decisions and responsibility will become obfuscated and that abilities to control them will be degraded?11

The Irish philosopher John Danaher asks whether algorithmic governance might even pose a risk to the moral and political legitimacy of public decision-making processes. He speaks of a “threat of algocracy”,12 meaning a situation in which algorithm-based systems massively limit the opportunities for humans to participate in and understand decisions and thus the options for action by those affected by them.

A number of scholars are also beginning to focus on the myriad consequences that digitalisation has for social policy, including how jobs are being eliminated or radically changed. This also has an impact on the field of legal practice, especially consulting professions, with the result that especially lucrative legal work will become concentrated at large law firms.13 There are expectations that some small and medium-sized law firms will disappear.

C. Algorithmic systems, including learning systems

The digital transformation of society is driven by the use of digital algorithms.14 Algorithms are rules that perform certain tasks in precisely specified individual steps. Humans have long used them for routine tasks. Many machines are also controlled by algorithms, even manually operated weaving looms. But where algorithms are designed to be used in computers, they have to be written in a language that a computer can process and read, that is, in a digital language. Another characteristic is the deterministic structure of the programming, meaning the precise specification of the steps to be taken.

There are simple algorithmic systems and now – with the aid of artificial intelligence – particularly capable IT systems. The latter include machine learning.15 The term refers to computer programs that are able to learn from records of past conduct and outcomes. In particular, the software has the ability to recognise patterns, evaluate images, translate texts, generate rules, and make predictions. It is used, for example, in search engines, for facial recognition, and for translating texts into other languages.
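As a deliberately tiny illustration of what “learning from records of past conduct and outcomes” means, the following Python sketch classifies a new case by its proximity to previously decided ones. The two numeric features (claim amount and a documentation score) and the outcomes are invented for illustration:

```python
# 1-nearest-neighbour "learning" from past cases (illustrative toy example).
from math import dist

# (features, outcome) pairs drawn from hypothetical past cases
past_cases = [
    ((1000.0, 0.9), "claim upheld"),
    ((1200.0, 0.8), "claim upheld"),
    ((5000.0, 0.2), "claim rejected"),
    ((4500.0, 0.3), "claim rejected"),
]

def predict(features):
    # The "pattern" is nothing more than proximity to previously seen cases.
    nearest = min(past_cases, key=lambda case: dist(case[0], features))
    return nearest[1]

print(predict((1100.0, 0.85)))  # -> "claim upheld"
```

Real systems use far richer features and statistical models, but the principle is the same: the rule is not written by a programmer; it is induced from the data.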

But the trend has not stopped there. The use of artificial


5 Concerning the latter, see Tom H. Braegelmann, Online-Streitbeilegung, in Markus Hartung /Micha-Manuel Bues/Gernot Halbleib (eds.), Legal Tech, Die Digitalisierung des Rechtsmarkts, 2018, 215 et seq.; Edwin Montoya Zorrilla, Towards a Credible Future: Uses of Technology in International Commercial Arbitration, SchiedsVZ 2018, 106 et seq.

6 On e-government, see, e.g., Bundesregierung, Digitale Verwaltung, BT-Drucks. 18/3074, 2014; Senat der Freien und Hansestadt Hamburg, Digital First – Chancen der Digitalisierung für eine bürgerfreundliche und moderne Verwaltung nutzen – Erweiterung der Strategie Digitale Verwaltung, Senatsdrucksache 2016/03060 of 11 October 2016; Margrit Seckelmann (ed.), Digitalisierte Verwaltung. Vernetztes E-Government, 2d ed. 2019.

7 On e-justice, see David Jost/Johannes Krempe, E-Justice in Deutschland, NJW 2017, 2703 et seq.; Wilfried Bernhardt, Quo vadis Digitalisierung der Justiz? Juris 2018, 310 et seq.

8 For more on this topic, see, e.g., Christoph Riegerer, Transparenz von Lieferketten durch die Blockchain, in: Breidenbach/Glatz (eds.), supra note 4, at 100 et seq.; Thilo Kuntz, Konsens statt Recht? Überlegungen zu Chancen und Herausforderungen der Blockchain-Technologie (manuscript), 2019.

9 See, e.g., Michèle Finck, Blockchains: Regulating the Unknown, 19 German Law Journal 2018, 665 et seq.

10 See https://futurism.com/artificially-intelligent-lawyer-ross-hired-official-law-firm. Generally see Kevin D. Ashley, Artificial Intelligence and Legal Analytics, 6th printing 2019, 15 et seq., 351 et seq.

11 See Thomas Wischmeyer, Regulierung intelligenter Systeme, 143 AöR 2018, 1, 18 et seq., 42 et seq.

12 John Danaher, The Threat of Algocracy: Reality, Resistance and Accommodation, 29 Philosophy & Technology 2016, 245 et seq.

13 Hariolf Wenzler, Big Law & Big Tech, in Hartung et al. (eds.), supra note 5, 77 et seq.

14 A critical analysis of the manifold functions of algorithms is provided by Rob Kitchin, Thinking Critically About and Researching Algorithms, Information, Communication & Society 2016, 1 et seq.

15 On machine learning and its application in the legal sphere, see Harry Surden, Machine Learning and Law, 89 Washington Law Review 2014, 87 et seq.; Ethem Alpaydin, Machine Learning, 2016.


intelligence as well as artificial neural networks16 makes it possible to largely simulate human ways of thinking and rules for acting.17 Another new buzzword is deep learning.18 Here, software is capable of enhancing digital programming created by humans and thus of evolving independently of such programming. Particularly strong systems can adapt on their own when confronted with new problem situations. These systems can also identify contexts, structures, and architectures on their own and improve their capabilities completely independently on the basis of the new information they gain. They can act reactively and proactively, and they can interact with other systems.

D. Can such systems be controlled?

Although human programming is still required at the outset of the development of such learning systems, they are designed to dispense as far as possible with human involvement in the enhancement of the software. One consequence of this is that humans are no longer able to fully understand how exactly these systems work. The U.S. researcher Andrew Tutt says about learning systems: “Even if we can fully describe what makes them work, the actual mechanisms by which they implement their solutions are likely to remain opaque: difficult to predict and sometimes difficult to explain. And as they become more complex and more autonomous, that difficulty will increase.”19 This finding obviously raises questions of responsibility, accountability, explicability, and comprehensibility, as well as of the opportunities for human supervision and judicial control, not to mention the ability to take corrective measures in the event of undesirable developments.

But some fear even farther-reaching consequences, including the risk that artificial intelligence could develop independently to such an extent as to endow itself with abilities that can no longer be controlled or undone by humans. It therefore comes as no surprise that warnings are being raised about the unrestricted use of artificial intelligence,20 even by individuals who have themselves helped to develop it or have intensively employed it. They include Elon Musk, the co-founder of PayPal and CEO of Tesla, Steve Wozniak, the co-founder of Apple, and Bill Gates, the co-founder of Microsoft, among many other prominent actors.21

E. Big data and big data analytics

I will now turn to what is known as “big data”.22 The high capability of algorithm-based systems, including their use in the area of Legal Tech, depends to a great extent on the availability of a large volume of data that are of the highest possible quality and can be used at high speed. Computers can store considerably more data than humans, with their limited memory, and can use them without loss. Computers are also generally faster than humans at analysing data.

Using so-called “big data analytics”, three basic abilities are employed to analyse data: description, prediction, and prescription.

1. With the aid of descriptive big data analytics, data are examined and processed for the purposes of analysis, and then prioritised, classified, or filtered. This may be useful for performing research, creating registers, and undertaking systemisation, as well as for compiling legal regulations and legally relevant materials, such as for analysing judicial precedents or opinions of law professors and legal practitioners.

2. Predictive big data analytics seeks in particular to identify correlations. The aim is, among other things, to gain an understanding of human behaviour, to identify emerging trends or behavioural patterns, and to predict behaviour. Predictive analytics is also used where the intention is to predict legally relevant decisions by analysing previous decisions, such as with the aim of aligning litigation strategy to the analysis.
Another prominent field of application is so-called “predictive policing”, which is used quite frequently in the U.S. and has also recently been introduced in Germany.23 The aim here is to identify areas or individuals that may pose a potential threat. Based on this, criminal policing activities are to be concentrated in as targeted a manner as possible on the corresponding individuals or areas – if necessary, accepting the associated risk that certain groups of the population will suffer stigmatisation and discrimination.

3. The third type, prescriptive big data analytics, is designed to help develop recommendations and strategies for action.
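The division of labour between these three modes can be made concrete with a toy sketch in Python. The docket, the use of a historical win rate as a predictor, and the decision threshold are all invented assumptions, not a real analytics pipeline:

```python
# Descriptive, predictive, and prescriptive analytics on an invented docket.
cases = [
    {"court": "LG Hamburg", "year": 2017, "plaintiff_won": True},
    {"court": "LG Hamburg", "year": 2018, "plaintiff_won": True},
    {"court": "LG Hamburg", "year": 2019, "plaintiff_won": False},
]

# 1. Descriptive: classify and summarise the existing material.
wins = [case for case in cases if case["plaintiff_won"]]
win_rate = len(wins) / len(cases)

# 2. Predictive: treat the historical rate as an estimate for the next case.
predicted_chance_of_winning = win_rate  # roughly 0.67 on this toy data

# 3. Prescriptive: derive a recommended course of action from the prediction.
advice = "litigate" if predicted_chance_of_winning > 0.5 else "settle"
print(win_rate, advice)
```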


16 See generally Stuart Russell/Peter Norvig, Künstliche Intelligenz, 3d ed. 2012; Julian Reichwald/Dennis Pfisterer, Autonomie und Intelligenz im Internet der Dinge, CR 2016, 2008 et seq.; Wolfgang Ertel, Grundkurs Künstliche Intelligenz, 4th ed. 2016; Künstliche Intelligenz verstehen als Automation des Entscheidens (Leitfaden), 2017, available at https://www.bitkom.org/sites/default/files/file/import/Bitkom-Leitfaden-KI-verstehen-als-Automation-des-Entscheidens-2-Mai-2017.pdf.

17 See Ashley, supra note 10.

18 See Ian Goodfellow/Yoshua Bengio/Aaron Courville, Deep Learning (Adaptive Computation and Machine Learning), 2016; Felix Stalder, Kultur der Digitalität, 2016, 177 et seq.; Stefan Kirn/Claus D. Müller-Hengstenberg, Intelligente “Software-Agenten”: Eine neue Herausforderung für unser Rechtssystem? Multimedia & Recht 2014, 307 et seq.

19 Andrew Tutt, An FDA for Algorithms, 69 Administrative Law Review 2017, 83, 102.

20 Having also been alerted to this, public authorities have initiated processes to monitor and analyse the use of artificial intelligence. See, e.g., Big Data: A Report on Algorithmic Systems, Opportunity and Civil Rights (May 2016), available at https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/2016_0504_data_discrimination.pdf; Big Data meets artificial intelligence: Challenges and implications for the supervision and regulation of financial services (16 July 2018), available at https://www.bafin.de/SharedDocs/Downloads/EN/dl_bdai_studie_en.html.

21 See, inter alia, Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, 29 Harvard Journal of Law & Technology 2016, 353, 355.

22 See Viktor Mayer-Schönberger/Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work and Think, 2013; Jürgen Taeger, Big Data & Co: Neue Herausforderungen für das Informationsrecht. Tagungsband Herbstakademie, 2014; Executive Office of the President, Big Data: Seizing Opportunities, Preserving Values, 2014.

23 For more on this topic, see Sabine Gless, Predictive Policing und operative Verbrechensbekämpfung, in: Felix Herzog/Reinhold Schlothauer/Wolfgang Wohlers (eds.), Rechtsstaatlicher Strafprozess und Bürgerrechte, 2016, 165 et seq.; Simon Egbert, Siegeszug der Algorithmen? Predictive Policing im deutschsprachigen Raum, 32/33 APuZ 2017; Timo Rademacher, Predictive Policing im deutschen Polizeirecht, 142 AöR 2017, 366 et seq.


F. Quality assurance24

The use of digital processes, including in the area of Legal Tech, depends heavily on the quality of the data being processed, of the information they convey, and of the software used for processing. Data quality includes – inter alia – the integrity and availability of the information conveyed with the aid of the data, and in some cases also the safeguarding of confidentiality. In addition, data have to be kept current, for instance through adjustment when the underlying factual conditions or the legal requirements change.

Quality also includes the security of the data and the software. Security problems may arise, for example, when data are processed using external clouds that are under the control of other actors, such as commercial undertakings, and may be susceptible to attacks by third parties. At the moment, ensuring data security is an especially important task that has yet to be fully mastered, especially in view of the diverse opportunities for hacker attacks.25

Quality problems may also result from the fact that digital programs are created through a division of labour, often without it being possible to specifically attribute the respective contributions of the various actors and the control effects they trigger. Nor is it always evident to those participating in the process which programming consequences are caused by the actions of which actors, and how far those consequences are compatible with one another.

Moreover, it normally has to be assumed that, when a variety of components created by outside parties are used during the design of hardware and software, their effects on programming may not be fully understood. The result may be programs with a dysfunctional design.

From a legal standpoint, software quality means, in particular, that the programming takes into account all legally relevant factors and excludes those that are normatively undesirable, such as valuations that contravene norms or are influenced by impermissible motives. Above all, it must be assured that discriminatory parameters are not built into the design of the algorithmic system.26
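One modest safeguard – sketched here purely for illustration, with invented attribute names – is a design-time audit that rejects any system whose declared inputs include attributes it must not use. A check of this kind cannot, of course, catch discrimination that enters indirectly through proxy variables or biased training data:

```python
# Hypothetical design-time check for impermissible decision parameters.
# "postcode" is listed because it can act as a proxy for protected traits.
PROTECTED_ATTRIBUTES = {"gender", "ethnicity", "religion", "postcode"}

def audit_features(declared_inputs):
    """Return the declared inputs that must not influence the decision."""
    return set(declared_inputs) & PROTECTED_ATTRIBUTES

model_inputs = {"income", "payment_history", "postcode"}
violations = audit_features(model_inputs)
if violations:
    raise ValueError(f"Impermissible decision parameters: {violations}")
```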

G. Differences between social and technical constructs

Formerly, legal decisions were (exclusively) “human-made”. Human decisions are social constructs. They are developed in specific contexts of an organisational, procedural, or cultural nature and are based on the relevant skills of the human decision-maker.27 The same applied, and still applies, to data entered by humans for processing by computer – such as the legal norms, facts, and circumstances that are important for resolving conflicts.

By contrast, the approaches used in algorithmic systems are technical constructs, even though the programs were created by humans and input is entered by humans. The technical process differs from human decisions in a variety of ways.

First, given their current technical capabilities, algorithms do not possess all the abilities that are distinctive of humans. For instance, algorithms lack the ability to use so-called “implicit knowledge” – knowledge that humans carry with them on the basis of earlier experiences, even where they cannot expressly articulate it. Programming a computer depends on the applied knowledge being explicit, in other words, on its being translatable into a computer-readable language.

In addition, computers lack some other abilities – at least sufficient abilities – that are important for some legal decisions. These include abilities concerning empathy, the development of creativity, and the use of intuition, which is also important for lawyers – in Germany, this is called “Judiz”. Moreover, algorithms reach limits (at least so far) with regard to the argumentatively derived interpretation of meaning,28 which is important when norms are to be interpreted. Algorithms are furthermore limited in their ability to undertake a complex balancing of all considerations and to calibrate the criteria for balancing in a way that satisfies the relevant conditions for their legal application.

Although some of these deficits cannot be eliminated, it is in many cases possible to gloss over them through simulation or through the use of correlations obtained by statistical procedures – often in such a way that the deficit no longer appears to be one at all.

At this point, some may doubt whether some of the human abilities I have just described are particularly important when dealing with law. Others may fear that such abilities may be employed by some persons in an undesirable manner, such as to reinforce prejudices or as a tool for manipulation and discrimination. It is no secret that humans are capable of socially undesirable behaviour and on occasion also employ it. However, it also should not be overlooked that corresponding deficits – such as latent discrimination – can also be built into software programs and then, undetected, find expression in countless decisions influenced by them.


24 See Michael Pruß/Frank Sarre, Datenqualität – Ein beherrschbares Thema aus technischer und juristischer Sicht im Zeitalter der Digitalisierung, in Jürgen Taeger (ed.), Rechtsfragen digitaler Transformation – Gestaltung digitaler Veränderungsprozesse durch Recht, 2018, 545 et seq.

25 See generally Thomas Wischmeyer, Informationssicherheitsrecht. IT-Sicherheitsgesetz und NIS-Richtlinie als Bausteine eines Ordnungsrechts für die Informationsgesellschaft, 50 Die Verwaltung 2017, 155 et seq.; Kipker/Mueller, International Regulation of Cybersecurity – Legal and Technical Requirements, MMR-Aktuell 2018, 414291 et seq.

26 See, e.g., Engin Bozdag, Bias in Algorithmic Filtering and Personalization, 15 Ethics and Information Technology 2013, 209 et passim; Omer Tene/Jules Polonetsky, Big Data for All: Privacy and User Control in the Age of Analytics, 11 Northwestern Journal of Technology and Intellectual Property 2013, 239 et seq.

27 For further details, see Wolfgang Hoffmann-Riem, Verhaltenssteuerung durch Algorithmen – eine Herausforderung für das Recht, 142 AöR 2017, 2 et seq., 26 et seq.

28 See Mireille Hildebrandt, Law as Computation in the Era of Artificial Legal Intelligence. Speaking Law to the Power of Statistics, 68 University of Toronto Law Journal 2018, 1 et seq.


H. Modelling law in digital software: Opportunities and difficulties

When digital technologies are employed to interpret and apply law, it must be assured that legal requirements are complied with. In general, it is possible to translate legal rules into technical rules. For this purpose, standardisation is necessary, since action by a computer requires clear language commands. In some fields, the norms are unambiguous and thus can be easily converted into algorithmic rules. Under these circumstances, software programming can easily satisfy the requirements concerning, in particular, the rule of law. This is also the case where the applicable facts and circumstances can be compiled unambiguously in a digital manner – for instance, in class-action and other large-scale proceedings or in some fields of taxation.
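Where a norm really is unambiguous, the translation is mechanical. The following Python sketch encodes an invented filing-deadline rule – “applications filed more than 30 days after notification are inadmissible” – as a deterministic check; every element of the norm maps onto one precise, machine-checkable step:

```python
# A hypothetical, unambiguous norm rendered as a deterministic rule.
from datetime import date, timedelta

FILING_PERIOD = timedelta(days=30)  # invented statutory period

def admissible(notified_on, filed_on):
    # The norm leaves no room for interpretation: a pure date comparison.
    return filed_on <= notified_on + FILING_PERIOD

print(admissible(date(2019, 6, 1), date(2019, 6, 20)))  # True
print(admissible(date(2019, 6, 1), date(2019, 7, 15)))  # False
```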

However, norms conceived in human language are in many cases characterised not by unambiguousness but instead by the fact that terms are vague and open to different interpretations.29 Also, norms often contain multiple terms, and when they interact with one another, this may create room for interpretation and different applications. Transferring such norms into software programming leads to a risk of reducing or even changing the substance of law.

However, legal practice and legal studies have come up with a number of suggestions for how terms and norms that are vague and open to interpretation can be given greater specificity. Precedents, in other words earlier decisions, can be drawn upon for the purpose of specification if the approaches taken in such decisions have broad acceptance. In Anglo-Saxon legal circles, this is facilitated by the fact that judicial precedent is recognised as having considerable weight. In Germany, while precedents are also significant,30 legal dogma (“Rechtsdogmatik”) is of particular importance.31 In both legal cultures, it is nevertheless possible to deviate from such previous consensuses and to understand terms in a different way, such as in the face of changed circumstances. If we are deprived of this ability through digital programming, this would create a risk that law will no longer be able to respond appropriately to social or economic change.32

In particular, unambiguous programming is unavailable for norms that specify aims and purposes but say nothing, or only little, about the permissible means of realising them. Decision-making leeway also exists where it is necessary to balance competing considerations. This is a familiar problem, for example, in risk law, where the possible or required measures have to be determined in accordance with the value of the legally protected interest that is in jeopardy. One formula here reads: As the importance of the jeopardised legally protected interest increases, the requirements with respect to the likelihood of impending damage must be lowered accordingly.33
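Rendered schematically – and only schematically, since the doctrine does not in fact reduce to a single number – the formula describes an inverse relationship between the importance of the protected interest and the probability threshold for intervening. The numeric scale and weighting below are invented:

```python
# Schematic rendering of the risk-law balancing formula (invented weights).
def required_likelihood(importance):
    """importance in (0, 1]; returns the probability of damage above which
    protective measures are warranted."""
    base_threshold = 0.8  # hypothetical threshold for minor interests
    return base_threshold * (1.0 - importance)

print(required_likelihood(0.2))   # minor interest -> high threshold (0.64)
print(required_likelihood(0.95))  # life and limb  -> low threshold  (0.04)
```

The legal point survives the caricature: the threshold is not fixed but calibrated to what is at stake.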

Legal requirements are ambiguous, in particular, where norms permit the exercise of discretion, require that predictions be made, or are designed to contribute to the planning and shaping of future structures. I have already mentioned the difficulties associated with the balancing of considerations. One example of the need to undertake such balancing, especially in the field of public law, is the application of the principle of proportionality.

Also fraught with uncertainty is the outcome of decisions in so-called “dilemma situations”, meaning situations in which all of the available alternative decisions will result in damage. How such situations are to be handled is currently the subject of intense discussion with respect to autonomous driving.34 How is the algorithm to decide when, in a given traffic situation, the automobile has only two options: driving into a group of children at play or into a group of senior citizens waiting at a bus stop?

There are many other situations in which legal programs call for decisions that in legal terms are not clearly or definitively pre-programmed. This is also the case where the legal system requires that the decision-maker apply subjective criteria as well. Examples include verifying whether duties of care were breached and applying standards of fault, such as negligence.

To the extent that the issue has to do with evaluating whether a fact is true – for instance, in connection with the assessment of evidence – the legal system, at least in Germany, expressly requires that the outcome depends on whether the decision-maker is convinced. Conviction is without doubt a subjective category in need of detailed specification. The same goes for the prediction of the recidivism risk of criminal offenders, which is intended to aid in making a decision about whether a sentence may be suspended. In the U.S., such decisions are often made on a purely automated basis35 – an unusual concept in German legal culture.

I. Supplemental controlling factors in applying law

This is exacerbated by a further problem. In view of such open decision-making situations, the legal system relies on the fact that, in addition to norms conceived in language,


29 See Wolfgang Hoffmann-Riem, Innovation und Recht – Recht und Innovation, 2016, 80 et seq.; Thilo Kuntz, Recht als Gegenstand der Rechtswissenschaft und performative Rechtserzeugung, 216 AcP 2016, 866 et seq., with further references.

30 See Mehrdad Payandeh, Judikative Rechtserzeugung, 2017.

31 See generally Christian Bumke, Rechtsdogmatik: Eine Disziplin und ihre Arbeitsweise. Zugleich eine Studie über das rechtsdogmatische Arbeiten Friedrich Carl von Savignys, 2017.

32 See Kuntz, supra note 8, sub III,1a.

33 Regarding this issue, see Matthias Klatt/Johannes Schmidt, Spielräume im öffentlichen Recht, 2010, 9 et seq.; Ralf Poscher, Eingriffsschwellen im Recht der inneren Sicherheit, 41 Die Verwaltung 2008, 345 et seq.

34 See, e.g., Philipp Weber, Dilemmasituationen beim autonomen Fahren, NZV 2016, 249 et seq.; Bundesministerium für Verkehr und digitale Infrastruktur, Ethik-Kommission automatisiertes und vernetztes Fahren, report from June 2017, at 16.

35 Nancy Ritter, Predicting Recidivism Risk: New Tool in Philadelphia Shows Great Promise, 271 National Institute of Justice Journal 2013, https://nij.gov/journals/271/Pages/predicting-recidivism.aspx; Julia Angwin/Jeff Larson/Surya Mattu/Lauren Kirchner, Machine Bias, 2016, available at https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing; Grant Duwe/Michael Rocque, Effects of Automating Recidivism Risk Assessment on Reliability, Predictive Validity, and Return on Investment (ROI), 16 Criminology & Public Policy 2017, 235 et seq.


there are additional factors that have an impact on the interpretation and application of law.36 These may be called supplemental controlling factors, such as the significance of the organisation making the decision. Thus, it often makes a difference whether the decision is made by a court – whether an individual judge or a collective body – or by an administrative authority. The procedure used to make the decision may also be relevant to it, such as whether and how a legal hearing is granted and whether there is access to all relevant files. But above all, the specific experiences and value orientations of the individuals charged with making the decision are important.

If in a given case the norm has to be specified as a norm designed to deal with the conflict at hand – in the literature on methodology, some Germans call this the creation of an “Entscheidungsnorm”37 – and if this task is no longer entrusted to individuals charged with applying the law but instead to a computer program, this results in a change in the decision-making factors imparted by the controlling factors of organisation, procedure, and personnel.38 Such contexts, however, are important for the quality of the resulting decisions. A software program that is designed to make automated decisions and that operates in a deterministic manner is developed under completely different contextual conditions than those surrounding the creation of a norm by the legislator and its application to a given case. The interpretation and application of the law very often require a specification that relates to the given contexts, including those of the specific conflict.

Pre-empting the significance of such factors through techniques like categorical regulation would not be a proper solution. It would amount to a return to the school of legal formalism, which for good reason has become outdated.39 Freezing one specific interpretation – a kind of “digital neo-positivism” – is not an adequate response to the challenges currently facing the interpretation and application of law.

J. Designing software with the involvement of lawyers and non-lawyers

It should be kept in mind that the actors involved in developing the requirements for the specific design of the software architecture differ from those normally involved in making and applying the law. The design of the software comprises coding for electronic data processing, software testing and, potentially, revision following experience gained in testing or real-time operation. By no means are lawyers always employed here, let alone employed exclusively. IT experts in particular are involved. Moreover, the programming process – that is, the procedure – is not governed by legal rules: to date, there are no procedural requirements concerning the creation of software, nor is it assured that only legal decision-making factors, or at least only those that are legally legitimated, are built into the programming. Furthermore, the programming process as such is normally not subject to any legal control, even though this could by all means be provided for, such as through procedures for the proactive certification of certain digital programs.

In view of the deterministic structure of the traditional approach taken by algorithms and the constraints that this occasions, programmers may be tempted to treat the relevant norm requirements as being unambiguous, even where they are not. They may also succumb to the temptation to simply feign this if need be.

K. Automated administrative decisions – using the example of German law

It is not possible to address the broad range of potential applications and challenges here. Since most discussions and academic work on legal technology and operations involve the use of Legal Tech by lawyers and focus on the fields of civil and criminal law, I would first like to deal instead with a different area, namely, the extent to which public administration and the courts are entitled to make digitally automated decisions.40 Second, I will deal with automated sanctioning of violations of legal obligations (Section L, below). I will restrict my remarks here to the current situation in Germany.

By way of note, I should first mention that algorithms have long been employed by the public administration and the courts in their daily work, including the preparation of decisions, for instance, when it comes to research or systemisation. Certain administrative decisions also have long been issued electronically and sent out without individual control by the case officer, such as pension notices or salary and benefits statements.41

Also, express rules have recently been enacted about when the public administration may make decisions on a fully automated basis. In Germany, they can be found in the Administrative Procedures Act,42 in the Fiscal Code,43 and in Book Ten of the Social Code.44 They show that German legislators are taking a very cautious approach here.45 The


36 See Hoffmann-Riem, supra note 29, at 180 et seq.

37 Concerning this concept, see Friedrich Müller/Ralph Christensen, Juristische Methodik, vol. I: Grundlagen für die Arbeitsmethoden der Rechtspraxis, 11th ed. 2013, in particular, paras. 233, 274; Hoffmann-Riem, supra note 29, at 60-61, 80 et seq.; Kuntz, supra note 29, 867, 873 et seq.

38 See Hoffmann-Riem, supra note 29, at 97-98.

39 See Hoffmann-Riem, supra note 27, at 17. According to Klaus Wiegerling, Daten, Informationen, Wissen, in: Breidenbach/Glatz (eds.), supra note 4, at 23, behind the “datafication” lie “positivistic metaphysics.”

40 I will not deal with the provisions on automated decision-making in Article 22 of the EU General Data Protection Regulation (Regulation (EU) 2016/679), which is binding on all EU Member States.

41 See, e.g., Hans Peter Bull, Der „vollständig automatisiert erlassene“ Verwaltungsakt – zur Begriffsbildung und rechtlichen Einhegung von E-Government, DVBl 2017, 409, at 409 et seq.; he also shares ideas on the question concerning the point at which an administrative decision should be considered fully automated (410-11).

42 Verwaltungsverfahrensgesetz. See, inter alia, sections 3a, 35a, 37 (2) and (3), and 41 (2).

43 Abgabenordnung. See, in particular, sections 155 (4) and 149 (4). Regarding related legal issues, see Julius Helbich, Rechtsfragen der “automatisierten” Ermessensausübung im Steuerrecht, DStR 2017, 574 et seq.; Christian Ahrendt, Alte Zöpfe neu geflochten – Das materielle Recht in der Hand von Programmierern, NJW 2017 at 500, 537 et seq.

44 Sozialgesetzbuch. See, e.g., section 31a of Book Ten.

45 See Nadja Braun Binder, Vollständig automatisierter Erlass eines Verwaltungsaktes und Bekanntgabe über Behördenportale, DÖV 2016, 891 et seq.; Thorsten Siegel, Automatisierung des Verwaltungsverfahrens, DVBl 2017, 24 et seq.; Ariane Berger, Der automatisierte Verwaltungsakt, NVwZ 2018, 1269 et seq.;


legislators continue to rely primarily on decisions made by humans. Fully automated decisions are permissible only where the norms in question do not allow the exercise of discretion and there is no assessment leeway with regard to the application of undefined legal terms.46 Put another way: Where leeway exists in making a decision because the law is, for instance, vague or ambiguous, and particularly where legally relevant interests have to be assessed and balanced against one another, the legislators consider the human factor to be indispensable to a just decision.47 In addition, the German legislators themselves have to clarify, by legal provision, whether the prerequisites for automated decision-making are met in the respective legal areas.

The facts and circumstances that are important for an administrative decision, in other words, the underlying legal conflict in particular, may generally be ascertained by the administration with the assistance of an automated facility. However, the Administrative Procedures Act specifies that if the parties to the conflict make factual assertions that are significant in the given case, a natural person must examine whether this is relevant for the decision, that is, whether he or she needs to supplement or modify the facts and circumstances ascertained in an automated manner to a material extent.
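Schematically – and this is a sketch of the workflow just described, not of the statute's wording, with invented function names and a placeholder significance test – the provision amounts to a routing gate placed ahead of the automated decision:

```python
# Hypothetical routing gate: significant party assertions force human review.
def process_case(automated_facts, party_assertions):
    significant = [a for a in party_assertions
                   if is_significant(a, automated_facts)]
    if significant:
        return "route to human review"   # a natural person must examine them
    return "decide automatically"

def is_significant(assertion, automated_facts):
    # Placeholder: whether an assertion is material is itself a legal
    # judgement; here we merely check whether it was already captured.
    return assertion not in automated_facts

print(process_case({"income reported"}, ["income reported"]))   # automated
print(process_case({"income reported"}, ["care obligation"]))   # human review
```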

Automated administrative acts are subject to the same constitutional guarantee of judicial review as other administrative acts. However, as of now, courts in Germany are not permitted to make automated decisions.48 It also does not suffice that they employ algorithms used by the public administration for the subsequent judicial review.49 The reach of a court decision, and thus also the criteria to be applied to it, are not identical to those that are controlling for the administration. In particular, a judicial review procedure is not structured like a procedure for issuing a new administrative act.

Particular problems arise when an automated administrative decision can be comprehensively reviewed for correctness only if the court engages with the underlying software, i.e. the automated decision-making program and its handling of the specific conflict. This would be made easier if the software employed could itself provide information about the reasons supporting the decision in a language understandable to humans. This type of “explainable artificial intelligence” has not yet been developed to the point of application, but it is being worked on.50

Effective judicial control also requires that the algorithms employed by the administration are disclosed to the courts. The extent to which algorithms, or at least their underlying criteria and precepts, have to be disclosed to the courts has yet to be legally clarified in Germany. In the case of learning algorithms, the training programs, and even the testing programs, would also have to be accessible.51

Even where such criteria are disclosed, it remains doubtful whether judges, who in most cases are not algorithm experts, will be able to undertake an effective review. In the case of learning software, it must additionally be taken into consideration that not even the specialists, let alone the specific IT programmers, know and understand in detail how the currently employed software – which may have been modified since the initial programming – functions. I refer to the statement by Andrew Tutt quoted earlier.

To the extent that, as is frequently the case, the algorithms are treated as business secrets or as official secrets with respect to the parties to administrative or judicial proceedings, the parties for their part have no opportunity to uncover errors in the algorithms or their application and to ask the court for a specific review.52

Thus, in Germany – but also elsewhere – there are still substantial obstacles associated with the use of certain automated decision-making systems by state authorities. But I don’t believe that the situation will remain like this. The tentative steps that have been taken so far in Germany also constitute an effort, on the one hand, to overcome opposition and, on the other, to gather experiences that can later form the basis for opening up additional fields of application.

L. Automated sanctioning of violations of legal obligations

Finally, I would like to address the opportunities for automated enforcement of compliance with legal obligations.

I will illustrate the problem with two examples, and although they do not reveal the great complexity of the fields of application, they highlight the point that I wish to make. The first example has to do with an apartment, rented under special conditions: If the tenant does not pay the rent on time and the apartment door is equipped with the requisite technical features, he or she is automatically locked out of his or her apartment without prior notice and is no longer


46 See section 35a of the Administrative Procedures Act, supra note 42. This provision has also been transposed into the Administrative Procedures Acts of the Federal “Länder”, albeit partly in a modified form. However, the prohibition is not designed to be controlling simply when the norm, as worded, permits discretion or uses a vague legal term, insofar as decision-making latitude is constrained by an administrative directive to reach certain decisions in cases of this type or by the administration’s obligation to comply with decisions it had previously made (“Selbstbindung durch ständige Verwaltungspraxis”). On that problem, see Christian Djeffal, Das Internet der Dinge und die öffentliche Verwaltung – Auf dem Weg zum automatisierten Smart Government? DVBl 2017, at 808, 814.

47 As argued by Lorenz Prell, in BeckOK VwVfG, 39th ed. (1 April 2018), section 35a, para. 14. For a discussion of the possibilities and restrictions of machine learning for decisions that include the exercise of discretion, see Viktoria Herold, Algorithmisierung von Ermessensentscheidungen durch Machine Learning, in: Taeger (ed.), supra note 24, 453 et seq.

48 See Peter Enders, Einsatz künstlicher Intelligenz bei juristischer Entscheidungsfindung, JA 2016, 721, 723.

49 See, e.g., Wischmeyer, supra note 11, at 57.

50 See Wischmeyer, supra note 11, at 61, with further references in note 247.

51 See Thomas Hoeren/Maurice Niehoff, KI und Datenschutz – Begründungserfordernisse automatisierter Entscheidungen, in: Rechtswissenschaft 2018, at 57 et seq.

52 One example: With regard to risk management systems under tax law, section 88 (5) of the Fiscal Code, supra note 43, expressly prohibits the reporting of details, to the extent that doing so would prejudice the equality and legality of taxation. See, e.g., Mario Martini/David Nink, Wenn Maschinen entscheiden … – vollautomatisierte Verfahren und der Persönlichkeitsschutz, 10 NVwZ-extra 2017, at 1, 10.


able to enter it and use it, even where a small child is still inside it. The second example involves a vehicle leased under special conditions: If the operator does not pay the lease charge when due and the vehicle has the appropriate sensors and connectivity, the vehicle can no longer be started.

Such possibilities for automated sanctioning exist in the important area of so-called “smart contracts”, where the terms of the agreement between the parties are written directly into digital code, which often runs on a decentralised blockchain network.53 This technology makes transactions traceable, transparent, and irreversible, and breaches of contract can be sanctioned easily.
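The leased-vehicle example from above reduces, in smart-contract terms, to a self-executing condition. The following sketch uses Python rather than an actual smart-contract language such as Solidity, and the class, names, and trivial payment check are invented for illustration:

```python
# Illustrative self-executing lease term (not a real smart contract).
from datetime import date

class LeasedVehicle:
    def __init__(self, payment_due):
        self.payment_due = payment_due
        self.paid = False

    def register_payment(self):
        self.paid = True

    def start_engine(self, today):
        # The sanction executes itself: no notice, no hearing, no discretion.
        if today > self.payment_due and not self.paid:
            return False  # vehicle immobilised
        return True

car = LeasedVehicle(payment_due=date(2019, 7, 1))
print(car.start_engine(date(2019, 7, 5)))  # False: locked out automatically
car.register_payment()
print(car.start_engine(date(2019, 7, 5)))  # True: sanction lifted
```

The crucial point is visible in the code itself: there is no step at which notice is given, objections are heard, or discretion is exercised; the sanction is simply a branch in the program.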

Automated sanctioning is also relevant with regard to filter technologies – known as “content curation” – that prevent violations of the law, such as the dissemination of hateful or racist content on the internet. In addition, it can be used as a tool to prevent the unauthorised use of works protected by copyright,54 a problem that the EU Directive on Copyright in the Digital Single Market has addressed. While the Directive has broad public support, it is not without its critics, especially with regard to the risk that the use of upload filters will infringe the freedom of opinion and information.

These examples of automated sanctioning show that law can be exploited for purposes other than merely specifying what someone may or may not do, with the affected person then deciding on his or her own whether to comply with the norm. In the case of automated sanctioning, technology excludes the interposition of a deliberate decision by the person involved. It is not even necessary for the affected person to have first been threatened with such a sanction, which would enable him or her to decide on his or her own whether to comply with a rule or disregard it. Such technically implemented rules are self-executing.55 In contrast to the sanctioning of violations of the law under the rules of civil procedure or administrative procedure, their self-executing application is an expression of a power asymmetry: The sanctioned individual has no chance to lodge objections.

Awareness of an impending sanction is one of the means for safeguarding the autonomy of those affected by it: As beings who think and are also capable of ethical action, they can decide on their own whether to comply with the legal proscription or whether there are instead reasons not to do so and, potentially, to risk a sanction. Under certain circumstances, the ability to refuse to follow a rule may even be desirable,56 for instance where conducting oneself not in conformity with a rule is more consistent with its meaning than “blindly” following it. This may be the case in the dilemma situations mentioned earlier. A relatively harmless example is where a driver decides to disobey a traffic law in order to avoid an accident. The sociologist Niklas Luhmann coined the term “useful illegality” to describe such special situations.57

M. Outlook

There are many more examples of how novel approaches are making use of digitalisation. The more that digital transformation encompasses the legal system, the more important it is that the involved parties have the corresponding skills.

Especially for tomorrow's lawyers, it will no longer be sufficient to learn and practice law in the way that was typical for the analogue world. One aim ought to be to acquire the ability to use the new technologies.58 But it is also important to reflect on what digitalisation is bringing about. If a contract is concluded in an automated manner, and if a breach of it is likewise sanctioned in an automated manner, this has consequences for the way that law is employed and thus for the way that interests are protected. The same applies where the issuance of administrative acts is left to non-transparent algorithms and vaguely understood software. It also makes a difference whether the review of lawfulness is handled by self-learning machines or by procedures governed by the rule of law and involving persons trained in the law and acting with judicial ethics. The changes are not limited to individual decision-making processes. They may also have an impact on society's acceptance of the law, on the role of law in settling disputes, and ultimately on the legitimation of the legal system and thus on its recognition as being just. I certainly do not mean to suggest that the use of legal technology as such should be considered negative or that the application of law in the “analogue age” was preferable. However, the consequences need to be analysed and assessed. Moreover, adequate opportunities for correction have to be created so that any undesirable developments can be remedied.

The more that digitalisation changes our social life, the more important it becomes to ensure transparency, responsibility, and accountability, as well as public and judicial control. It is essential to prevent citizens from being treated as uninformed, unthinking, or even submissive objects. It is also important for lawyers to continue to maintain a critical distance from the matters they are engaged with.


53 For more on this topic, see Martin Heckelmann, Zulässigkeit und Handhabung von Smart Contracts, NJW 2018, 504 et seq.; Klaus Eschenbruch, Smart Contracts, NZBau 2018, 3 et seq.; Martin Müller, Bitcoin, Blockchain und Smart Contracts, ZfIR 2017, at 600; Markus Kaulartz, Rechtliche Grenzen bei der Gestaltung von Smart Contracts, in: Jürgen Taeger (ed.), Smart World – Smart Law? Weltweite Netze mit regionaler Regulierung, 2016, 1023 et seq.

54 See Kevin Dankert, Normative Technologie in sozialen Netzwerkdiensten, 49 KritV 2015, at 56-57; Omer Tene/Jules Polonetsky, Taming the Golem: Challenges of Ethical Algorithmic Decision-Making, 19 North Carolina Journal of Law & Technology 2017, at 125, 154 et seq.

55 See Wolfgang Schulz/Kevin Dankert, Die Macht der Informationsintermediäre, 2016, II.3.B.

56 Timo Rademacher, Wenn neue Technologien altes Recht durchsetzen: Dürfen wir es unmöglich machen, rechtswidrig zu handeln?, JZ 2019, 702 et seq.

57 Niklas Luhmann, Funktion und Folgen formaler Organisation, 2d ed. 1972, 304 et seq.

58 See Susskind, Tomorrow’s Lawyers, supra note 3.