The Data Protection Impact Assessment as a Tool to Enforce Non-discriminatory AI

In this paper, Ivanova argues that the novel tools under the General Data Protection Regulation (GDPR) may provide an effective, legally binding mechanism for enforcing non-discriminatory AI systems. Building on relevant guidelines, the general literature on impact assessments, and work on algorithmic fairness, the paper proposes a specialized methodological framework for carrying out a Data Protection Impact Assessment (DPIA), enabling controllers to assess and prevent ex ante risks to the right to non-discrimination, one of the key fundamental rights the GDPR aims to safeguard.


Chosen by: Stephanie Sheir

Theme: Algorithmic impact assessment

