About us / Case study

Mandating Algorithmic Impact Assessments

“The Canadian Directive on Automated Decision-Making currently focuses on protecting federal institutions and their clients when algorithms are used to inform decisions for service delivery. I recognize a gap in offering similar protection to the workers in the federal government. As we look to update the Directive to address this policy gap, the IFOW reports are proving to be a rich source of relevant information and recommendations.” Benoit Deshaies, Office of the Chief Information Officer, Government of Canada, November 2021

IFOW has worked with the Canadian Government to help identify particular challenges in applying the flagship Canadian Directive on Automated Decision-Making. In particular, IFOW research and reports have highlighted gaps in its application and encouraged review to ensure that the impacts of automated decision-making on work and workers are always considered. We are delighted to learn that a review of the Directive is now underway.

In turn, the UK Government is considering building on the Canadian model, which mandates Algorithmic Impact Assessments for the public sector. IFOW has made the case that Algorithmic Impact Assessments should be developed into a systematic framework for accountability in both the public and private spheres. Our policy briefing and toolkit, including draft legislation, set out how to make this happen here.

Sign up to our Newsletter

We would love to stay in touch.

Our newsletters and updates let you know what we’ve been up to, what’s in the pipeline, and give you the chance to sign up for our events.

You can unsubscribe at any time by clicking the link at the bottom of our emails or by emailing dataprotection@ifow.org. Read our full privacy policy, including how your information will be stored, by clicking the link below.