The Institute for the Future of Work is pleased to announce a new project with the ICO looking at worker involvement in the auditing of algorithmic systems. This project will be led by Abby Gilbert, the Institute's Head of Research.
Algorithmic systems are increasingly transforming our experience of work. As the Institute’s research has demonstrated, this can have detrimental consequences for equality and fairness, and lead to a range of social, material and psychological impacts on workers.
To identify these impacts, and safeguard against them, there is an emerging consensus that AI and automated decision-making tools should be audited. A range of statistical auditing tools have become available, which promise to detect and correct bias in algorithms, making them ‘fair’. However, as we have demonstrated, such tools are often not compliant with UK equality law, are not transparent about the definition of fairness they adopt, and do not create space for workers to reveal possible impacts of the systems.
This approach also fails to create space for workers to understand and influence how their data is processed. Auditing as it stands does not allow for fully informed decision-making by workers about trade-offs in the design of algorithmic systems.
Policy discussions have long focused on the need for ‘meaningful’ human review of automated decisions by algorithmic systems where they have significant consequences for those affected. As businesses look to excel in compliance and advance best practice, guidance is needed on what such involvement could look like in practice.
The Institute for the Future of Work has identified a gap: guidance for businesses that want to involve workers in auditing algorithmic systems.
This is particularly important, given that at present there is:
In this project, we build on the principles of meaningful human review to consider the involvement of workers in auditing processes.
We will identify a range of methods for workforce participation in human review and auditing processes, and create a toolkit explaining why these approaches are needed. Please get in touch with firstname.lastname@example.org if you would like to discuss or be part of this work.