Front cover of Good Work Algorithmic Impact Assessment report
March 28, 2023

Good Work Algorithmic Impact Assessment

Artificial intelligence (AI) and algorithmic systems are increasingly used in the workplace, and designed, developed and deployed in ways which can transform people's access to work, the conditions under which they work and the quality of the work they are employed to do.

When well designed, these technologies offer new opportunities to increase efficiency, augment capacity and drive growth. But this transformation is also driving a wide range of social, psychological and material impacts.

Whether it's about how their rights are respected, how their working conditions are likely to change, or how their interests are balanced with those of the business, workers need confidence that these systems are being used fairly and transparently.

Supported by the UK Information Commissioner’s Office (ICO) Grants Programme, this guidance is designed to help employers and engineers to involve workers and their representatives in the design, development and deployment of algorithmic systems so that risks are anticipated and managed, 'good work' is promoted, the law is complied with, innovative approaches are unlocked and trust in technology is built.

To complement this guidance, we have produced two resources that help improve accessibility and understanding of the ways in which algorithmic systems can impact work.

First, the Good Work Charter toolkit identifies ten dimensions of 'good work', and outlines the main legal and ethical frameworks that apply.

Second, 'Understanding AI at Work' provides accessible explanations of how the impacts of AI at work are determined by human choices in its design, development and deployment.

Together with this guidance, these resources will help employers assess the wide range of impacts that AI and other algorithmic systems may have on Good Work.

The publication of IFOW’s Good Work Algorithmic Impact Assessment marks a major policy breakthrough for ensuring that the design, development, and deployment of work-related AI technologies is equitable, responsible, and trustworthy. As more and more of everyday work life is impacted by the use of AI systems, the GWAIA provides AI developers, procurers and users with a very accessible and proportionate pathway to implementing best governance practices. The uptake of this guidance, both in the UK and globally, will prove crucial to the future of good work, and any organisation, big or small, considering the use of work-related AI would do well to put it into practice.
— Professor David Leslie, Professor of Ethics, Technology and Society, Queen Mary University of London

The Good Work Algorithmic Impact Assessment represents an important contribution to advancing responsible innovation in the context of workplace AI, forefronting good work in shaping a responsible and trustworthy AI ecosystem. We look forward to continued work with IFOW that draws on this expertise on AI applications in the workplace context.
— Dr Florian Ostmann, Head of AI Governance and Regulatory Innovation, The Alan Turing Institute

Authors: Abigail Gilbert, Anna Thomas, Gwendolin Barnard, Stephanie Shier

Permission to share: this resource is protected under a Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0) license.

You may copy, share or adapt this resource but you must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests IFOW endorses you or your use. If you remix, transform, or build upon this material, you must distribute your contributions under the same license.




