Reconciling legal and technical approaches to reasoned and contestable administrative algorithmic decision-making

Partners: Utrecht University & Tilburg University
Type: PhD
Duration: 2023–2027

Algorithmic decision-making is here to stay. Algorithms are making their way from less controversial domains, such as recommending music and video content, to more complex ones, such as administering child support, grading students and deciding where to focus limited public resources. However, when entering the public domain, algorithms become subject to procedural safeguards intended to protect people from arbitrary state action. These safeguards were developed in an analogue setting in which decisions are made by human officials. Whether they can still protect fundamental rights when public decisions are increasingly made by – or with the help of – algorithms remains to be seen.

One such procedural safeguard requires that administrative decisions which impact the rights and freedoms of individuals be reasoned. In the EU, the duty to give reasons derives from Article 41(2)(c) of the Charter of Fundamental Rights of the European Union. At the national level, the source of the administrative duty to give reasons differs per country, but the duty is expressly recognised in a number of EU jurisdictions.

Providing reasons for administrative decisions is important on at least two counts. First, it enables decision subjects to ascertain the grounds on which a decision has been made and whether they want to contest it. Second, a statement of reasons is necessary for competent courts to exercise their power of review. Consequently, the duty to give reasons directly contributes to making administrative decisions contestable.

Administrative algorithmic decisions – whether partially or entirely automated – have to be reasoned and contestable as well. Giving reasons for decisions that rely on complex algorithms, or what some may call artificial intelligence (AI), presupposes an understanding of how an algorithm has reached a specific outcome. Algorithmic decision-making involves a process that differs from human cognitive reasoning. Making this process understandable to the outside world is the aim of scholars in the field of Explainable Artificial Intelligence (XAI). However, even though considerable advances have been made in explaining the functioning of algorithmic systems, not all system functionality can always be made comprehensible.

This project investigates what kind of reasoning the right to a reasoned and contestable decision requires when public decisions are made by – or with the help of – algorithms. This interdisciplinary question will be studied using doctrinal and comparative legal research methods while taking into account the technical limits of explaining algorithmic output. The requirements of the administrative duty to give reasons will be analysed under both EU law and selected national laws, with the aim of uncovering whether existing legal frameworks sufficiently protect fundamental rights when algorithms are used in administrative decision-making. The project will also explore whether the extent of the required reasoning depends on specific decision characteristics, such as the type of decision, its consequences, or the type and role of the algorithm used in decision-making.

The goal of this project is to move the increasingly automated society towards a better understanding of what are – or should be – the legal boundaries to explaining algorithmic decisions in the public sector. Clarifying these boundaries would enable developers to design systems that embed the required degree of legal protection. Furthermore, it would help practitioners in the public sector to better assess when and how using algorithmic tools in their work is appropriate.

Within the Algosoc consortium, this project falls under the Governance element in the Justice sector. By focusing on the contestability of administrative decisions, this project studies the prerequisites of citizens’ access to justice in the administrative domain. At the core of this project are the values of transparency and accountability of public action as well as perceived fairness of public decision-making.
