
Trustworthy AI and accountability: yes, but how? – PhD thesis by Mona de Boer

What the EU AI Act’s approach to AI accountability can learn from the science of algorithm audit

As artificially intelligent algorithmic systems (hereafter: ‘AI systems’) penetrate more deeply into organizations and society, legitimate concerns about their effectiveness for the full range of users are driving legislators around the world to seek ways to curb the downsides of these systems for individuals and society. Among these regulatory initiatives is the European Commission’s proposal for AI regulation, the EU AI Act (April 2021), which is expected to become a reference point in the international discourse on how to regulate AI systems. The AI Act, inspired by product safety regulations, introduces – among other things – a set of ‘safety requirements’ for Trustworthy AI and (proportionate) obligations for all participants in the AI value chain, with the aim of enhancing and promoting the protection of health, safety, fundamental rights and the Union values enshrined in Article 2 of the Treaty on European Union.

Specifically, providers of high-risk AI systems, as defined by the Act, are obliged to demonstrate compliance with the safety requirements for Trustworthy AI – with or without the intervention of independent third parties – by means of two evidential and complementary mechanisms: (a) the pre-market conformity assessment and (b) post-market monitoring. However, the AI Act’s elaboration of these two ‘proof mechanisms’ provides little instruction as to what the practices should entail, creating a regulatory grey area on which the relevance of the Act depends.

This research focuses on this gap – perceived as ‘huge’ – between (1) the high-level objectives and requirements of the regulation and (2) the day-to-day practice of Trustworthy AI, and makes recommendations to address the lack of methodology connecting the two.

Author: Mona de Boer.

Source: UvA.

Image credits

Header image: Pixabay – AI

Icon image: Image generated with DALL-E 2 (front cover of the thesis)
