Algorithmic systems are being implemented in a growing number of areas and are being used to make decisions that have a profound impact on our lives. They involve opportunities as well as risks. It is up to us to ensure that algorithmic systems are designed for the benefit of society. The individual and collective freedoms and rights that comprise human rights should be strengthened, not undermined, by algorithmic systems. Regulations designed to protect these norms must remain enforceable. To achieve this objective, we’ve developed the following Algo.Rules together with a variety of experts and the interested public.
The Algo.Rules are a catalogue of formal criteria for enabling the socially beneficial design and oversight of algorithmic systems. They provide the basis for ethical considerations as well as for the implementation and enforcement of legal frameworks. These criteria should be integrated from the start in the development of any system and therefore be implemented by design. Given their interdependence, the Algo.Rules should be treated as a composite unit. Interested stakeholders and experts are invited to join us in developing the Algo.Rules further and to adopt them, adapt them, expand them and, above all, explore opportunities to apply them in practice. Dynamic by design, the Algo.Rules should be fine-tuned, particularly in terms of their practical implementation.
The term “algorithm” refers to a set of precise instructions or rules regarding actions to be taken in solving a predefined problem. An algorithmic system is a system composed of one or more algorithms used in software to collect and analyze data and draw conclusions as part of a process designed to solve a predefined problem. The system can involve machine learning or follow pre-programmed decision-making rules. Drawing on the Algo.Rules to evaluate an algorithmic system includes taking into consideration the broader socio-technical context in which the software is embedded. This involves, for example, considering how results are interpreted and how this informs the user of a system’s decisions. The Algo.Rules apply to the entire process of algorithmic system development as well as to the systems’ embedding within a social context.
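The narrow sense of “algorithm” used above, a set of precise, pre-programmed instructions for a predefined problem, can be illustrated with a deliberately minimal sketch. The decision rule, threshold and variable names below are invented for illustration; an algorithmic system in the sense of the Algo.Rules also encompasses the data collection, interpretation and social context surrounding such a rule.

```python
# Minimal illustration of an "algorithm" in the narrow sense: a precise,
# pre-programmed decision rule. The 40% debt-to-income threshold is an
# invented example, not a real lending criterion.
def credit_decision(income: float, existing_debt: float) -> str:
    """Pre-programmed rule: approve if debt is below 40% of income."""
    return "approve" if existing_debt < 0.4 * income else "review manually"

print(credit_decision(50_000, 10_000))  # → approve
```

The Algo.Rules would ask not only whether this rule is correct, but also where the income and debt figures come from, how the output is presented, and who acts on it.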
The Algo.Rules focus on those algorithmic systems that have a significant impact on society or individual lives, regardless of whether this involves direct or indirect effects. It is therefore not a question of establishing a set of design rules for all algorithmic systems but, rather, for those that are socially relevant. In order to determine whether an algorithmic system is socially relevant, an impact assessment should be conducted before it is designed. The stronger the potential influence of an algorithmic system on society or people’s lives, the more closely it should comply with the Algo.Rules.
The Algo.Rules address everyone who significantly influences the creation, development, programming, implementation or effects of an algorithmic system, as well as everyone who has commissioned the development or integration of such a system. We purposefully do not limit our attention to programmers, because a system’s effects are shaped not only by its code but also by its objectives, training data, organizational context and the ways in which its results are presented, interpreted and applied. A system’s goals are often determined by the client or by executives: should the system help generate the largest possible profit, solve a problem as fast as possible, or should societal interests be the primary consideration? Operators and software designers also have considerable influence on the effects brought about by an algorithmic system, since they determine how a decision made by the system is presented and applied in practice. The Algo.Rules establish a uniform set of guiding principles for all of these individuals.
The Algo.Rules address, in particular but not exclusively, the following groups:
The Algo.Rules describe the overarching rules that stakeholders should observe if they want to design and use algorithmic systems ethically. To make them useful in everyday work routines, we have specified these rules further. We offer practical aids that are directly addressed to and tailored for different target groups. They explain what the Algo.Rules mean for developers and executives and provide recommendations on how public sector organizations can implement the Algo.Rules. In the course of the project, we are developing various practical guides with specific suggestions. Some of these guides are available in English.
We invite interested parties to adopt, adapt and extend the Algo.Rules and to integrate the rules into their daily work. For this purpose, all project results are available freely and under a CC License here:
The ways in which algorithmic systems affect society cannot be reduced to a matter of programming code alone. Just as relevant are a system’s goals and the values underlying them, the data flowing into the system, and how results are presented and interpreted. The application of algorithmic systems raises technical as well as ethical and legal issues. We have therefore taken care to integrate a diverse group of actors and perspectives into the process. This includes representatives from:
Our group of experts played an important role here as major contributors to the development of the Algo.Rules.
In addition to the complexity and diversity of algorithmic systems, the Algo.Rules must take into account the different stages of a system’s life cycle and the people participating along the way. To keep the rules understandable to everyone, we have been as specific as possible and as general as necessary.

It makes little sense to formulate such rules behind closed doors and without reference to practical application. We therefore commissioned an analysis of the quality criteria behind binding codes of professional ethics such as the Hippocratic Oath and the Press Code. The findings show that a participatory approach to development is one of the most important success factors for such codes. Our work therefore focuses on developing and refining the rules together with a large and diverse group of individuals and organizations who contribute their knowledge as part of an open and dynamic process. This includes IT experts and representatives of other disciplines such as the humanities and law, as well as individuals who work in politics, civil society and business.

The effects of algorithmic systems concern us all. This is why, in addition to our workshops and consultations with experts, we have engaged the broader public through an open online survey and by discussing the Algo.Rules on panels and at meetups. The Algo.Rules are a joint effort. The process by which they are developed is coordinated by the Bertelsmann Stiftung and the think tank iRights.Lab.

We have defined nine Algo.Rules and are currently working on models for the practical application of each rule. We have refined the Algo.Rules for two focus groups and the contexts in which they apply: for developers of algorithmic systems we published a guidebook, and in an impulse paper we discuss the role of executives and organizations. Furthermore, we are focusing on the public sector.
For civil servants, we are compiling support resources for implementing the Algo.Rules in the planning, development and operation of algorithmic systems.
Phone: +49 5241-810
Fax: +49 5241-81681396
E-Mail: carla.hustedt (at) bertelsmann-stiftung.de
Phone: +49 30 40 36 77 230
Fax: +49 30 40 36 77 260
Algorithmic systems have been a part of our daily lives for quite some time. They make decisions for and about us by filtering job applications, delivering diagnoses regarding our health or assessing creditworthiness. It is therefore important for us to discuss how the use of this technology can present us all with more opportunities than it does risks. We believe the design of algorithmic systems must therefore follow certain rules. The Algo.Rules are our proposed set of principles. Our goal is to see the Algo.Rules applied in all relevant algorithmic systems.
Those of us involved with the “Ethics of Algorithms” project at the Bertelsmann Stiftung initiated the process for developing the Algo.Rules. We want to contribute to the design of algorithmic systems that result in a more inclusive society. Societal benefit – not technical feasibility – should be the guiding principle here. One of the overriding objectives of the Stiftung’s work is to ensure that digital transformation serves the needs of society. We aim to strengthen the voice of civil society, build bridges between disciplines, foster collaboration between theory and practice, and take a proactive approach to solving problems. We believe that developing rules for the design of algorithmic systems is one of many promising strategies which, in combination with each other, can help ensure that algorithms serve the common good. The independent think tank iRights.Lab has been commissioned to manage the development process. iRights.Lab has several years of experience in developing complex systems and structures in a transdisciplinary way, together with a diverse network, and in making them relevant for practical use. Together, we aim to drive the public debate forward and make a positive contribution to society in this field.
The Algo.Rules were created in an interdisciplinary, multisectoral and open process. We work in an interdisciplinary manner because the effects of algorithmic systems can only be understood through the convergence of different perspectives. We incorporate a variety of perspectives because actors from academia, civil society, politics and the business sector should engage in more extensive dialogue with one another. We have also chosen to pursue a fundamentally open approach, because the future of our digitalized society concerns all of us.
The process was launched at a workshop in May 2018. The theoretical groundwork for the Algo.Rules was laid in the context of two studies (on success factors for codes of professional ethics, and on the strengths and weaknesses of existing compendia of quality criteria for the use of algorithms), as well as through the consideration of numerous other sets of principles for algorithmic systems. In parallel, we consulted with 40 experts from the political, business, civil society and academic sectors over the course of the summer, asking them for intensive feedback. A further workshop in the autumn of 2018 addressed issues related to the implementation of the Algo.Rules. Finally, at the end of that year, the general public was invited to take part and contribute ideas through an online participation process. After evaluating this process, the Algo.Rules were launched in March 2019 at a press conference.
Subsequently, we worked on specifying the Algo.Rules for two focus groups: developers and executives. For this purpose, two workshops and numerous consultations with experts from the field took place during 2019. The resulting guidebook, published in June 2020, provides information on how each of the Algo.Rules can be implemented in practice. An impulse paper describes the role of executives in this process.
The project Algo.Rules is part of a group of initiatives that aim to promote the design of algorithmic systems for the common good. To start an exchange of ideas, we organized a workshop with experts from all over Europe.
In order to put Algo.Rules into practice beyond that, we participated in the development of an AI-Ethics-Label. As part of the AI Ethics Impact Group, we developed a proposal to help put general AI ethics principles into practice. The label makes AI ethics measurable and operational. We published the corresponding working paper in April 2020.
Since then we have been preparing further practical aids for the public sector. Many algorithmic systems which are used in the public sector can have a strong impact on the lives of individuals and society. By August 2020, we want to show how the Algo.Rules can provide orientation in the planning, development and use of algorithmic systems.
However, the process will not end there. We call on interested parties to work with us in developing the Algo.Rules further. We invite others to borrow, adapt and expand them, and above all to find ways to implement the rules in a real-world context. The Algo.Rules are and remain a dynamic process.
A professional ethic for programmers alone would not be enough, as programmers are not the only ones who play a crucial role in shaping algorithmic systems. Many other professional groups, from company executives to users themselves, are just as relevant. We do not want to look simply at a program’s code; rather, we want to examine how an application is socially embedded, whom it will affect and how it is implemented. The Algo.Rules are oriented toward the entire process by which algorithmic systems are created and applied. This is one of the strengths of our approach: it addresses multiple target groups together. The target groups relevant in this regard are described in greater detail both here and in the preamble.
The Algo.Rules do not claim to apply to all algorithmic systems. They apply only to those that have a direct or indirect, but always significant, impact on people’s lives or on society. For example, this includes software used to filter job applications, make recommendations to the police or to courts, or make decisions in disease-diagnosis settings. Thus, our approach is not about establishing design rules for all algorithmic systems but only for those that are relevant in this way. A recent assessment of potential impact on social inclusion, carried out by Ben Wagner and Kilian Vieth on behalf of the Bertelsmann Stiftung, offers an example of criteria that could guide this kind of relevance evaluation. Among other factors, relevance depends on who is implementing the algorithmic system, whether it is making decisions about people, what larger processes and what area the algorithmic system is embedded in, and what consequences the system’s decisions might have for individuals’ lives or within society more broadly. In this regard, we classify systems at different levels: the stronger the algorithmic system’s potential influence on people’s lives or on society, the more important it is that the Algo.Rules be followed, and the more stringently the system should be reviewed.
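To make the idea of a graded relevance assessment concrete, here is a minimal, hypothetical sketch. The criteria echo those named above (who implements the system, whether it decides about people, what it is embedded in, and what consequences it has), but the specific criteria names, weights and level thresholds are our illustrative assumptions, not part of the Algo.Rules or the Wagner/Vieth assessment.

```python
# Hypothetical triage for classifying a system's potential impact at
# different levels. Weights and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class SystemProfile:
    decides_about_people: bool      # does it make decisions about individuals?
    public_sector: bool             # is it deployed by a public institution?
    affects_rights_or_access: bool  # consequences for rights, services, opportunities?
    people_affected: int            # rough number of people affected

def impact_level(p: SystemProfile) -> str:
    """Map a system profile to a coarse impact level (0-3)."""
    score = 0
    score += 2 if p.decides_about_people else 0
    score += 1 if p.public_sector else 0
    score += 2 if p.affects_rights_or_access else 0
    score += 1 if p.people_affected > 10_000 else 0
    levels = ["level 0: no special review",
              "level 1: basic review",
              "level 2: thorough review",
              "level 3: stringent review and oversight"]
    return levels[min(score // 2, 3)]

# Example: a job-application filter used by a large employer
print(impact_level(SystemProfile(True, False, True, 50_000)))
# → level 2: thorough review
```

The point of the sketch is the graded logic, not the numbers: the higher a system scores on such criteria, the more stringently the Algo.Rules should be applied and the system reviewed.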
We have looked closely at sets of principles and criteria that have previously been published or are currently in the development phase. Most of these were created in the USA. In a previous study, we analyzed three of these in greater detail, including the “Principles for Accountable Algorithms and a Social Impact Statement for Algorithms” produced by the FAT/ML Conference, the Future of Life Institute’s “Asilomar AI Principles,” and the “Principles for Algorithmic Transparency and Accountability” published by the ACM U.S. Public Policy Council. Our work additionally incorporated findings derived from analyses of additional compendia and from exchanges with numerous other projects. The AI Ethics Guidelines Global Inventory provides a good overview of all initiatives.
The Algo.Rules represent a complement to and a further development of existing initiatives. They differ from other compendia in two respects: First, they are not oriented solely toward programmers. Rather, they address all persons who are involved in the development and application of algorithmic systems. Second, they include formal design rules, while eschewing reference to moral norms. To a certain degree, this makes them universal.
The Algo.Rules emerge from a European cultural context. However, they have also been influenced by international discussions regarding the ethics and value-oriented design of algorithmic decision-making systems and applications. The version presented here has thus far been drafted primarily by actors active in Germany. However, they are oriented toward a public beyond Germany, and are thus being published in other languages as well. As the Algo.Rules do not contain moral norms, they are to a certain degree universal, and applicable within a variety of cultural contexts.
Aside from the effort to address a diverse and wide-ranging target group, the challenge facing the Algo.Rules lies in their actual implementation. Our analysis has shown that many of the other existing compendia of criteria have fallen short because their drafters did not focus specifically on the practicalities of implementation, or because their implementation strategies never bore fruit. This is precisely why we are developing practical support resources. The first step is specifying the Algo.Rules for the target groups of developers and executives. We are currently working on the application of the rules in the public sector. For this purpose, we are developing a handbook for public institutions on implementing the Algo.Rules in the planning, development, procurement and use of algorithmic systems.
The Algo.Rules are an offer to the public and a possible starting point for many relevant processes. The guide for developers and the impulse paper for executives show what the Algo.Rules mean for specific target groups. Important steps have thus already been taken. The next and most important step is the concrete implementation. Here we focus on the public sector. Whether it is predictive policing, the automatic processing of social welfare applications or the sorting of unemployed people – some of the particularly hotly debated algorithmic systems come from the public sector. Systems that are used there tend to have a major impact on the lives of individuals and on society. Therefore, it is particularly important to create a basis for ethical considerations, for the implementation and enforcement of legal frameworks in the public sector through the use of the Algo.Rules. To this end, we analyze various case studies as well as the process of public planning, development and use of algorithmic systems. We consult and interview experts from the public sector. The aim is to make the Algo.Rules practically applicable and to develop proposals for implementation measures.
The Algo.Rules remain an invitation: Anyone who wants to help us create more specific versions of the Algo.Rules, or who has an idea regarding applications that could be used to test the implementation of the rules, should reach out to us using our contact form.