Welcome back to IFOW’s monthly newsletter. I hope you enjoyed the summer as much as we did and have settled back into your pre-pandemic or new rhythms of work. At IFOW, we’re trying out hybrid working with our fantastic new team to retain the benefits of flexible working in time and space.
We’re not alone. As the world has reopened, policymakers, employers and workers are looking at how we can embed the better practices learned through the pandemic. The Government is considering establishing a right for workers to request flexible working from day one, which could be a first step towards developing ‘Day 1 Digital Rights’, as we have recommended. So far, so good – but we must watch out for disparate impacts and new divides, as IFOW Head of Research Abby Gilbert discusses in an interview with the BBC here.
This week the Government released its long-awaited National AI Strategy. Against this backdrop, I joined Beth Gutelius, Mary Towers, Ed Miliband and Geoff Lloyd on the ‘Reasons to be Cheerful’ podcast to discuss the rise of robo-bosses. We look at what the UK needs to do to fulfil its ambition to lead in AI governance, as well as innovation, drawing lessons from California’s landmark bill passed earlier this month. The bill targets Amazon’s algorithm-driven rules by requiring warehouses to disclose the quotas and metrics used to track workers. We cover similar trends in algorithmic management in UK warehouses, based on IFOW’s ground-breaking research in ‘The Amazonian Era: How algorithmic systems are eroding good work’.
Our full response to the AI Strategy is here. We think next steps should include backing the strategy’s commitments in the upcoming Spending Review, engaging a wider range of stakeholders, and bringing forward an Accountability for Algorithms Act.
Next month will see the launch of the APPG on the Future of Work’s inquiry into AI and Surveillance in the Workplace, established in response to ‘The Amazonian Era’ and marking a step change in the debate on AI regulation at work. We’ll also be sharing the new IFOW Knowledge Hub, which will feature reading material on a range of topics relating to the future of work, curated by our team and some of our favourite collaborators. We’re excited to share this excellent resource with you all.
Have a good weekend,
Institute for the Future of Work
Deep Dive: Automated Inequality and Hiring
New investigative work from Global Witness has found that Facebook’s algorithm appears to place advertisements in ways that breach UK equality law. This blows the idea that algorithms are neutral, non-discriminatory decision-makers out of the water.
Research by IFOW shows that algorithms are used to predict the probability that someone will click on a job advert, apply for the job and succeed in their application. Optimising this ‘click conversion’ is valuable because it saves the client money. An algorithmic system can draw on a variety of data sources, including data the platform holds on users’ general behaviour on the site, as well as their engagement with other job adverts. These sources are then combined and analysed to identify patterns that can reflect characteristics such as age, gender and postcode, and online behaviour such as likes, contacts and clicks.
For example, if men tend to look at managerial adverts more, or tend to hold managerial positions more, the algorithm will begin to show managerial roles disproportionately to men. This means that even if Facebook’s client has not given overtly discriminatory instructions, as in the Global Witness case, the system will begin to reflect and entrench existing patterns of recruitment – and inequality.
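The feedback loop described above can be sketched in a few lines of Python. This is a deliberately simplified toy, not a description of Facebook’s actual system: the ‘model’ here just memorises group-level click rates from fabricated historical data, yet even that is enough to route the advert entirely to the already over-represented group.

```python
# Illustrative toy only -- not Facebook's actual system. All figures are
# invented to show how a click predictor trained on skewed history
# routes adverts back to the group that already engaged most.

# Historical training data: (is_male, clicked_managerial_ad).
# In this fabricated history, men clicked managerial adverts more often.
history = [(1, 1)] * 70 + [(1, 0)] * 30 + [(0, 1)] * 30 + [(0, 0)] * 70

def click_rate(data, is_male):
    """Historical click rate for one group -- our stand-in 'model'."""
    clicks = [clicked for m, clicked in data if m == is_male]
    return sum(clicks) / len(clicks)

p_male = click_rate(history, 1)    # 0.7
p_female = click_rate(history, 0)  # 0.3

# Delivery shows the advert only where predicted click probability is high,
# so impressions flow disproportionately (here: entirely) to men.
audience = [1] * 500 + [0] * 500
shown_to = [u for u in audience if (p_male if u == 1 else p_female) > 0.5]
print(f"Share of impressions going to men: {sum(shown_to) / len(shown_to):.0%}")
```

No one in this sketch asked for a discriminatory outcome; it emerges purely from optimising click conversion against a biased history.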
Just as concerning, the auditing tools available to identify bias and other adverse impacts routinely import US assumptions about the requirements of equality law into the UK. Research for IFOW’s Assessing Impacts on Equality project, led by Dr Logan Graham (now adviser on AI at No 10), shows that auditing tools are often unclear about their approach and are not aimed at correcting inequalities when they are detected.
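To make the imported-assumptions point concrete, here is a minimal sketch of the kind of US-style check many off-the-shelf auditing tools apply by default. The ‘four-fifths rule’ is genuine US EEOC guidance, but the hiring figures and function names below are invented for illustration; UK equality law contains no equivalent numeric threshold, so ‘passing’ this check says little about compliance with the Equality Act 2010.

```python
# Sketch of the US 'four-fifths rule' that many auditing tools bake in.
# The 80% threshold comes from US EEOC guidance; UK equality law has no
# equivalent numeric test. All figures below are invented for illustration.

def selection_rate(selected, applicants):
    return selected / applicants

def four_fifths_check(rate_group, rate_reference):
    """Return the impact ratio and whether it clears the US 80% threshold."""
    ratio = rate_group / rate_reference
    return ratio, ratio >= 0.8

# Hypothetical hiring funnel: 48 of 120 men and 30 of 90 women shortlisted.
men_rate = selection_rate(48, 120)    # 0.40
women_rate = selection_rate(30, 90)   # ~0.33
ratio, passes = four_fifths_check(women_rate, men_rate)
print(f"Impact ratio: {ratio:.2f}; clears US four-fifths rule: {passes}")
```

A system can clear this US threshold while still producing the kinds of disparity that matter under UK law, which is exactly why importing the test unexamined is a problem.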
‘Automated inequality’ through algorithmic decision-making at work is pervasive and shows no signs of slowing down. This loops us back to IFOW’s proposal for an Accountability for Algorithms Act, co-developed with Task Force members from the Turing Institute, the Information Commissioner’s Office and the Equality and Human Rights Commission.
Interesting Reads
IFOW has published a blog examining how automation affects employment prospects, written by R. Maria del Rio-Chanona, Penny Mealy, François Lafond, Mariano Beguerisse-Diaz, and J. Doyne Farmer.
Abeba Birhane has written a new paper on the limitations of machine learning prediction when it comes to social behaviour.
Karen Yeung and Lena Ulbricht explore developments in our understanding of how algorithms can change our behaviour. At IFOW we describe this in the Amazonian Era as the 'Human Data Cycle'.
The Resolution Foundation has published its ‘Work Experiences’ report, looking at the subjective experience of work to provide a rounded picture of the changing realities of employment as policy and the economy have evolved from the 1980s to the late 2010s.
The Ada Lovelace Institute, AI Now Institute and Open Government Partnership have published global analysis of the initial wave of algorithmic accountability policy for the public sector.
You are invited to join the upcoming virtual Chatham House Future of Work conference as IFOW’s guest.
Taking place online from 7-8 October, the conference will welcome IFOW Co-Chair Sir Christopher Pissarides, alongside an international audience of policymakers, senior business leaders and labour market experts.
Over the course of the two days, discussion will explore the trends transforming the nature of work, with a focus on skills, resiliency, and adaptability.
Registration is complimentary for our valued IFOW network. Please select the £294 ticket at checkout and enter the code FOWIFOW21 to receive free access. Full agenda and registration details can be found here: https://cht.hm/38p4pX3
If you have any ideas, comments, or suggestions for our round-up, please drop us a line at firstname.lastname@example.org.
Thank you for your time and interest. If you enjoyed this and know someone else who might benefit from our newsletter, please share it with them. If this has been forwarded to you and you would like to receive the update yourself, please subscribe here.