Responsible AI Resources

Resources to help firms and organisations adopt AI to promote good work.

Advancing innovation and social good

We are optimistic about the future of work, but believe that action must be taken to ensure that innovation and social good advance together. The toolkits provided here are designed to help businesses achieve this by building and protecting good jobs as they invest in AI technologies.

If you are interested in being one of these businesses, please do email us.

The Good Work Charter Toolkit sets out the legal and regulatory frameworks that exist to support the ten fundamental principles in the Good Work Charter that should be protected and promoted whenever new workplace technologies are introduced.

Understanding AI at Work helps employers and workers understand how human choices in the design, development and deployment of AI systems can impact these Good Work principles.

The Good Work Algorithmic Impact Assessment presents an approach for involving workers in the assessment of AI systems so that Good Work principles can be built and sustained.

Going further - our Responsible AI Sandbox

AI and algorithmic systems are transforming work in profound ways. These data-driven technologies have the potential to create new, good jobs and improve access to good quality work, but they can also have adverse impacts.

As we navigate this technological transition, businesses, system engineers and workplace representatives need practical tools to understand the risks and opportunities that AI presents, so that a fairer future of better work can be built.

Launching in early 2024, The Sandbox will offer firms and employees guidance and methodologies to identify problems and build practical solutions that pre-empt risks and enhance ‘good work’ when designing, developing and deploying AI systems.

Businesses have real choices when designing, developing and deploying new technologies. With a sharp focus on good work impacts, The Sandbox will help firms and employees make those better choices.

There is good evidence that businesses that complement AI with investment in human capabilities see better productivity returns.

Our unique range of expert resources, aligned with regulatory best practice and the Good Work Charter, will enable businesses to foreground the voice and experience of stakeholders at all levels of the workforce.

We invite those who want to be trailblazers in responsible AI to work with us. If you share that vision, or would like to explore investing in it, we’d love to be in touch.

"A suite of highly practical, and accessible tools ... this is how we empower business to future-proof responsibly." - Dr Anne-Marie Imafidon MBE, IFOW Trustee and Stemettes CEO

"The IFOW toolkits fill an important gap. They will help leaders be responsible with AI and feed into development of good regulation." - Lord Jim Knight, co-chair of the All Party Parliamentary Group on the Future of Work

"The Good Work Algorithmic Impact Assessment represents an important contribution to advancing responsible innovation in the context of workplace AI." - Dr Florian Ostmann, Head of AI Governance and Regulatory Innovation, The Alan Turing Institute

"The publication of IFOW's Good Work Algorithmic Impact Assessment marks a major policy breakthrough for ensuring that the design, development and deployment of work-related AI technologies is equitable, responsible and trustworthy." - Professor David Leslie, Professor of Ethics, Technology and Society, Queen Mary University.

Register your interest

Sign up to our newsletter

We would love to stay in touch.

Our newsletters and updates let you know what we’ve been up to, what’s in the pipeline, and give you the chance to sign up for our events.

You can unsubscribe at any time by clicking the link at the bottom of our emails or by emailing data@ifow.org. Read our full privacy policy, including how your information will be stored, by clicking the link below.