June 8, 2022

Response to the Digital Regulation Cooperation Forum consultation on algorithmic auditing

In Spring 2022, the Competition and Markets Authority, Ofcom, the Information Commissioner's Office and the Financial Conduct Authority – collectively the Digital Regulation Cooperation Forum (DRCF) – published a call for input to their Algorithmic Processing workstream on how the four regulators should approach regulating algorithms. The Institute for the Future of Work was pleased to contribute the following response. The findings of the call for input are available here.

H1: There may be a role for some regulators to clarify how external audit could support the regulatory process, for example, as a means for those developing and deploying algorithms to demonstrate compliance with regulation, under conditions approved by the regulator.

Regulators ought to issue guidance on the role, best practice and requirements of external audits: in particular, when an external rather than an internal audit is necessary, what purpose it would fulfil, and what procedural requirements it should meet. Regulators will need to work together more closely to issue such guidance. Our research and analysis suggest that current regulatory oversight of impacts on access to work and on its terms and quality is insufficient.

The Health and Safety Executive and the new Single Enforcement Body for Employment should be included in the DRCF (or, alternatively, consulted and worked with as closely as possible) to ensure that impacts on work and on people are properly understood and taken into account, and that there is meaningful accountability, including redress for any harms. Please see, for instance, our proposed draft model for an algorithmic impact assessment, including work impacts, in the policy brief attached at Annex 1.

Disadvantages of the external audit approach include haphazard or sparse application if corporations are unsure of the conditions and requirements that would make an external rather than an internal audit necessary. Insufficient incentives to conduct external audits, whether in the form of penalties or otherwise, would compound this problem. The DRCF needs to make clear when external audits are needed and what purpose they serve, so that corporations can best demonstrate compliance.

In the work context, external audits should be required, without the need to meet risk thresholds, wherever algorithmic systems may impact access to work or its terms, conditions or quality. Work ought to be considered a high-risk domain in and of itself, given the particularly high stakes for individual rights in the workplace, the information asymmetries between the parties, and the nature of the contractual relationships involved.

H2: There may be a role for some regulators in producing guidance on how third parties should conduct audits and how they should communicate their results to demonstrate compliance with our respective regimes.

The DRCF ought to provide clear, single guidance for third parties, signed off by all four regulators, rather than separate pieces of guidance from each. Although it is understood that the regulators in the DRCF span different remits and would thus require different types of audit, the DRCF should, where possible, provide single pieces of guidance on the procedural requirements for audits: for example, how audits should be conducted across the supply chain; how stakeholders should be consulted and notified; the processes for triggering external or internal audits; and how audits should be reported to the regulator, the public or other parties.

While the content of the audits themselves is understood to vary according to the differing requirements of each regulatory regime, more consistency and uniformity are possible, certainly in procedure but also in substance. Cross-cutting minimum ‘red lines’ and goals are needed and should be spelled out in primary legislation. More detailed guidance at the regulator and sectoral level should be developed and updated over time.

H3: There may be a role for some regulators in assisting standards-setting authorities to convert regulatory requirements into testable criteria for audit.

There is certainly a role for regulators here. Unified procedural guidance, as described in our response to the previous question, alongside guidance from individual regulators on the substantive content of audits under their differing regimes, would provide much-needed clarity to corporations. In this context, we note the EU AI Act's requirements for documentation and impact assessment both before and after market deployment.
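By way of illustration only, the sketch below shows how one kind of regulatory requirement (non-discrimination in an automated hiring screen) might be converted into a testable audit criterion. The data, group labels and the four-fifths threshold are illustrative assumptions, not criteria proposed in this response or by any DRCF regulator.

```python
# Illustrative sketch: operationalising a non-discrimination requirement as a
# testable audit criterion. The "four-fifths rule" threshold, the group labels
# and the sample data are hypothetical assumptions for demonstration only.

def selection_rates(outcomes: dict[str, list[int]]) -> dict[str, float]:
    """Selection rate per group, where each outcome is 1 (selected) or 0."""
    return {group: sum(results) / len(results) for group, results in outcomes.items()}

def passes_disparate_impact_test(outcomes: dict[str, list[int]],
                                 threshold: float = 0.8) -> bool:
    """True if every group's selection rate is at least `threshold` times
    the highest group's rate (the illustrative four-fifths rule)."""
    rates = selection_rates(outcomes)
    highest = max(rates.values())
    return all(rate >= threshold * highest for rate in rates.values())

# Hypothetical audit sample: screening outcomes for two applicant groups.
audit_sample = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # selection rate 0.75
    "group_b": [1, 0, 0, 1, 0, 0, 1, 0],  # selection rate 0.375
}
print(passes_disparate_impact_test(audit_sample))  # False: 0.375 < 0.8 * 0.75
```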

It should be noted that some aspects of the audit, particularly assessment of impacts on the nature and quality of work, will need to include qualitative methods, rooted in direct engagement with those who have lived experience of relevant harms. In this sense, not all elements of the audit can be operationalised as quantitative, universally applied criteria of the kind sketched above. Such engagement will also be necessary as part of developing the infrastructure and processes to identify trade-offs and mitigations once a harm has been identified.

H4: Some regulators may have a role to provide mechanisms through which internal and external auditors, the public and civil society bodies can securely share information with regulators to create an evidence base for emerging harms. Such mechanisms could include a confidential database for voluntary information sharing with regulators.

Sharing relevant evidence with regulators is essential. The reverse should take place as well: regulators should also provide information to the public, civil society bodies and researchers, with differing tiers of disclosure for different parties and interests. Current examples of limited disclosure include the proposal in the US Algorithmic Accountability Act 2022 for the regulatory body, the Federal Trade Commission, to publish aggregate, anonymised reporting on the findings of impact assessments on an annual basis; the Canadian AIA Directive; and the IFOW model attached.

A more rigorous model of disclosure would see regulators share information with trusted partner institutions on algorithmic harms, and on the limitations of and problems with audits and impact assessments, with a view to collaboration between regulators and civil society to improve regulation and compliance.

H5: There may be a role for some regulators in accrediting organisations to carry out audits, and in some cases these organisations may certify that systems are being used in an appropriate way (for example, through a bias audit) in order to demonstrate compliance with the law to a regulator.

We agree with this statement. Please note, however, that organisations offering only simple bias audits will not be sufficiently qualified, in particular because of the need for the wider impact assessment proposed in Annex 1.

H6: For some regulators there may be a further role to play in expanding the use of regulatory sandboxes (where a regulator has power to do so) to test algorithmic systems in a controlled environment.

Regulatory sandboxes would be particularly helpful in the domain of work. An algorithmic audit sandbox for the impacts of algorithms on ‘good work’ would be very useful in creating a better understanding of how best to evaluate, measure and mitigate threats to data protection, equality, wellbeing and other cross-cutting harms in a controlled environment, whether as a real-world trial or as a simulation. Such a sandbox would also be helpful in understanding how to positively promote equality, beyond existing negative rights and duties to avoid discrimination. This need has been highlighted by IFOW research in AI and Hiring and Mind the Gap (please see Annex 2). A sandbox would also inform the development of appropriate regulation.

Author

Institute for the Future of Work
