Public values for citizen’s access to justice: How automated decision-making systems challenge the values underpinning fundamental rights and the way to preserve them

“…if I’m to give you advice I’ll have to know what it’s all about,” said Miss Bürstner. “That’s exactly the problem,” said K., “I don’t know that myself.” Shocking as it may sound, the ordeal of Josef K. in Kafka’s “The Trial” may not be far from a dystopian reality. Artificial Intelligence (AI) and Automated Decision-making Systems (ADS) may use population data to create profiles that place individuals in groups (e.g. determining who will be categorised as a potential suspect of crime), or they may rely on criteria by which individuals are awarded or denied a benefit (e.g. determining who will receive a loan and who will be refused financial services). Such systems raise ethical and legal issues relating to justice, touching on rights and democratic values including fairness, transparency, non-discrimination, consistency, good governance, equality of arms, a fair trial and effective remedies. But what exactly does ‘justice’ mean in this context? And who administers this ‘justice’? What do the notions of ‘justice’ and ‘rights to justice’ entail in a technologically changing world, and how do they relate to the notion of ‘access to justice’? How do the values underpinning fundamental rights inform justice and justice rights? How, and to what extent, are rights to justice and the values informing them affected by ADS? What checks and balances are required to preserve citizens’ justice rights?

With these crucial questions in mind, this project will first examine the notion of justice (in its procedural and institutional sense and beyond) and its relationship to the concept of access to justice. It will take account of the ever more pressing question of the allocation of justice-related tasks between state and non-state actors. It will reflect on the phenomenon of the privatisation of justice resulting from the engagement of big tech companies and their services or products in the performance of tasks traditionally ascribed to public institutions.

It will then analyse how the values underpinning fundamental rights – such as dignity, autonomy and equality – inform the proposed considerations on justice, and how and to what degree those values are affected by ADS.

Finally, the project will explore what checks and balances are needed to preserve citizens’ rights to justice, including justice access rights, as public values and fundamental rights are challenged by fast-developing technology that threatens to turn Kafka’s fictional vision into reality.

Ultimately, the goal of this project is to determine and critically reflect on how the public values underpinning fundamental rights are affected by ADS, and to propose the safeguards needed to preserve citizens’ rights to justice.

In doing so, the project will contribute to the overall goal of the AlgoSoc program, namely answering the question of how public values can be realised in the algorithmic society. It will specifically address the challenge of conceptualising and articulating those values amid the shifting patterns of institutional and individual decision-making power in the algorithmic society. Although the analysis will be carried out in the AlgoSoc ‘justice’ sector, many of the findings should prove universal and also apply to the health and media domains.

The project will largely rely on traditional legal research methods, such as the doctrinal legal method, complemented by elements of legal philosophy and empirical legal research. However, the multifaceted character of the examined phenomenon and the need to address it comprehensively will demand careful study of relevant scholarship from disciplines other than law, particularly philosophy (especially ethics and political philosophy), science and technology studies (STS) and computer science.
