October 17, 2019

Announcing the IFOW Equality Task Force

News of facial analysis software being used in job interviews across the UK has thrown equality law into the spotlight. How does our equality framework apply to new uses of data-driven technology as it drives a multi-dimensional transformation of work? Is the protection it offers enough to promote meaningful legal equality in the age of artificial intelligence? Is technology being used to build workplace diversity - or not?

The Institute for the Future of Work (IFOW) has established a Task Force to examine how the Equality and Data Protection Acts apply to some increasingly common uses of new technology, and to explore ways beyond individual rights to advance equality in the workplace. Reducing barriers to opportunity for disadvantaged groups is not only the right thing to do for individuals. Discrimination in the labour market damages growth and innovation at firm and national level [Bell et al 2019].

The Task Force’s Terms of Reference are here and our background discussion paper can be found here. IFOW is developing a set of case studies against which to test our legal frameworks. The cases are hypothetical but based on research combined with the expertise of the group. The Task Force, which includes representation from the Equality and Human Rights Commission, will draw out key legal and policy challenges, make recommendations, and give practical advice.

Work has only just begun, but some themes and key challenges are already emerging. These invite the early attention of decision-makers.

1)    Application of the Equality Act is being tested

The case studies have highlighted gaps in our understanding of how different models of ‘people analytics’ work in practice, as well as how to identify the real reason for decisions made, and how and by whom adjustments can be made. This makes it harder to untangle less favourable treatment on the grounds of a single protected characteristic, such as sex, as currently required under the Equality Act. Establishing ‘causation’ is a challenge when decisions involve people analytics - one of the reasons why IFOW has proposed using hypothetical cases which can be unpicked openly by the expert group.

2)    Training data can reflect hidden structural inequalities  

People analytics systems are trained on data sets that reflect existing structures and patterns of behaviour. These patterns mirror the complex correlations that characterise disadvantage: race, sex and disability correlate with a range of other features, such as the way we behave online, where we live, and our educational and employment background. The facial recognition software used in recruitment tools tends to be trained on data that reflects past recruitment practice and assumptions about who is a ‘good’ employee. Employers’ attention is shifting from performance to prediction – a shift which encourages decision-making based on comparisons between ‘similar’ individuals and groups.

Without attention and positive action, this may exacerbate hidden structural inequalities, exposing vulnerable individuals and demographic groups to a double disadvantage.
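
To make this mechanism concrete, here is a minimal sketch in Python of an entirely invented scenario: a screening model is trained on historical hiring decisions that were biased against one group, and although the protected characteristic itself is withheld from the model, a correlated proxy feature (standing in for something like postcode or online behaviour) carries the disadvantage into its predictions. None of the names or numbers come from any real recruitment tool.

```python
# Hypothetical sketch: historical bias survives even when the protected
# characteristic is excluded, because a proxy feature correlates with it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

protected = rng.integers(0, 2, n)            # e.g. sex; never shown to the model
proxy = (protected + (rng.random(n) < 0.2)).clip(0, 1)  # e.g. postcode area
skill = rng.random(n)                        # genuine job-relevant signal

# Biased historical labels: qualified members of group 1 were
# passed over 40% of the time.
hired = ((skill > 0.5) & ~((protected == 1) & (rng.random(n) < 0.4))).astype(int)

# Train only on skill and the proxy - the protected characteristic is absent.
X = np.column_stack([skill, proxy])
model = LogisticRegression().fit(X, hired)
pred = model.predict(X)

# The disadvantage reappears in the model's predictions.
for g in (0, 1):
    print(f"group {g}: predicted selection rate {pred[protected == g].mean():.2f}")
```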

In light of these changes, employers and others need to think harder about how technology is used - and how decisions to use it are made - and what this means for people’s experience of equality. The fact that machine-learning systems reveal connections between existing forms of discrimination and underlying systemic inequalities strengthens the argument for broadening equality protection to tackle socio-economic disadvantage.

3)    Asymmetry of information is growing

The increasing use of people analytics and data-mining practices at work, combined with the general lack of transparency about their uses, intensifies a growing asymmetry of information between employers and workers. This applies to the use of personal data and machine learning in the job advertising, application and performance management stages of employment. Research conducted by IFOW outside the Task Force indicates that this is linked to a diminishing sense of control and autonomy among working people.

We anticipate that equality impact assessments and auditing tools will play an increasingly important role for employers seeking to reduce risk and for employees seeking information and assurance about best practice at work.
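
As an illustration of the kind of check such an auditing tool might run, the sketch below computes the ‘four-fifths’ adverse impact ratio - a rule of thumb drawn from US employment practice rather than anything codified in UK equality law. The outcome data is invented for demonstration.

```python
# Hypothetical audit check: the 'four-fifths' adverse impact ratio.
from collections import Counter

def selection_rates(outcomes):
    """outcomes: iterable of (group, was_selected) pairs."""
    selected, total = Counter(), Counter()
    for group, was_selected in outcomes:
        total[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / total[g] for g in total}

def adverse_impact_ratio(outcomes):
    """Lowest group selection rate divided by the highest."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values()), rates

# Invented screening outcomes from a people-analytics tool.
outcomes = ([("A", True)] * 60 + [("A", False)] * 40
            + [("B", True)] * 35 + [("B", False)] * 65)
ratio, rates = adverse_impact_ratio(outcomes)
print(rates)                          # {'A': 0.6, 'B': 0.35}
print(f"impact ratio: {ratio:.2f}")   # 0.58 - below the 0.8 rule of thumb
```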

4)    There is no one type of fairness or discrimination

This takes us to the concept of ‘fairness’. The FATE (fairness, accountability, transparency and ethics) community has developed different concepts of mathematical fairness, such as ‘fairness through unawareness’. Some of these are grounded in ideas of procedural fairness (fairness derived from a fair, transparent and user-friendly method of making a decision) and some in ideas of outcome fairness (fairness in the sense of reasonable and consistent outcomes for individuals and groups). Both are possible frames for auditing tools.
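
To show how differently these definitions can behave on the same decisions, here is a short illustrative sketch computing two standard outcome-fairness metrics, demographic parity and equal opportunity. The definitions are standard in the FATE literature, but the screening data is invented.

```python
# Illustrative sketch: two outcome-fairness metrics applied to the same
# hypothetical screening decisions.
import numpy as np

def demographic_parity_gap(pred, group):
    """Difference in selection rates between the two groups."""
    return abs(pred[group == 0].mean() - pred[group == 1].mean())

def equal_opportunity_gap(pred, label, group):
    """Difference in selection rates among genuinely qualified candidates."""
    tpr = [pred[(group == g) & label].mean() for g in (0, 1)]
    return abs(tpr[0] - tpr[1])

rng = np.random.default_rng(1)
group = rng.integers(0, 2, 2000)
label = rng.random(2000) < 0.5                             # truly qualified or not
pred = label & ~((group == 1) & (rng.random(2000) < 0.3))  # biased screening

print(f"demographic parity gap: {demographic_parity_gap(pred, group):.2f}")
print(f"equal opportunity gap:  {equal_opportunity_gap(pred, label, group):.2f}")
```

Both metrics flag the same biased screen, but to different degrees - and which one matters depends on the legal and contextual question being asked.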

So far so good, but it is still unclear how these different concepts of fairness relate to the requirements of equality law. What is more, recent research by IFOW partners Data Justice Lab suggests that where the law is considered during the design process for systems of people analytics, it tends to follow US law rather than UK law, because of the North American origin of many analytic tools.

For the moment, encouraging employers to share more information about their use of people analytics, and offering tools and guidance to help them assess risk and the potential for discrimination, are important steps in the right direction. These are the tasks which the IFOW Fairness Framework has begun to address.

The next Task Force meeting is a technical session on 29 October, kindly hosted by the Institution of Engineering and Technology. The IFOW Task Force is chaired by Helen Mountfield QC.

This blog is an IFOW commentary and does not reflect the views of the Task Force.

