"We argue that a more comprehensive approach than technical auditing is needed in the use of AI for hiring, which shapes access to work. Auditing tools are not designed or equipped to address and mitigate many forms of bias, discrimination and inequality when they are detected." – Dr Logan Graham, co-author of IFOW's 'AI in Hiring', advisor to IFOW's Equality Task Force and Special Advisor on AI at No 10 Downing Street.
IFOW has been working to improve the governance and regulation of artificial intelligence and data-driven technologies at work since the publication of our cutting-edge research on the auditing of AI tools. This research was led by Dr Logan Graham for IFOW before he took up his current position as Special Advisor on AI at No 10 Downing Street. The research found that existing auditing tools and approaches routinely import US assumptions and are not equipped to address, or even mitigate, adverse impacts when they are detected.
In May 2021, the Government stated in its response to the Committee on Standards in Public Life that it would consider 'fresh governance structures' to ensure appropriate 'leadership and accountability' for AI, including AI at work. Specifically, the Digital Regulation Cooperation Forum will work to 'identify gaps in regulation', respond to them and cooperate on areas of mutual importance; the CDEI will take on an AI monitoring and testing role, working more closely with external partners and stakeholders; and the EHRC will work with partners to develop guidance on data bias and anti-discrimination law.
IFOW will continue to develop its proposal for an Accountability for Algorithms Act.