November 11, 2020

Black Lives Matter. In the past, in the present, and in the future.

October is Black History Month, which celebrates the contributions of people of African and Caribbean descent to British history. This year it has received more attention than usual, coinciding with the birthday of George Floyd and a renewed global Black Lives Matter movement.

An additional backdrop has been provided by the pandemic: years of structural inequality in work and health continue to be reflected in coronavirus deaths. For the first time, this has been recognised at a national level by the Office for National Statistics and Public Health England.

Given the circumstances, I’ve had to take heart and see the positive in more folks taking time to look forward. Awareness of the depth of racial injustice has been building, alongside a greater understanding of intersectionality and a willingness, across sectors, to begin proper journeys of equitable practice towards belonging at work. This contrasts sharply with the empty, actionless ‘Diversity and Inclusion’ rhetoric of previous years.

I am a technologist – and passionate believer in the potential power and opportunities of technology when it is well designed, well used, and well shared. Unfortunately, I’m confronted with technology cases that fall short. The stakes are now too high for us not to build technology well – and take equality impacts more seriously.

There’s the facial recognition algorithm on the video-meeting platform Zoom, which erased the head of a man because it could not distinguish his black skin. Or the Twitter photo algorithm that ‘prefers’ lighter images (including lighter skin) for tweet previews. The British passport renewal service that asks darker skinned citizens why their mouth is open, when it’s not.

With the right practice and great tech teams, these issues are easy to spot.

Others are not – which makes them more pervasive and dangerous.

Let’s take the example of data-driven HR technology. These tools are often programmed to identify ‘good’ employees.

These algorithmic systems reflect existing patterns of recruitment, taking into account things like personality tests, the sound of people’s voices, behaviour on social media, educational background and where people live.

It’s almost impossible for someone who suspects they have been discriminated against to know whether their race, sex, or socioeconomic background has caused maltreatment.

They may not know anything about the system used, or its purpose; they cannot see the internal process by which the machine has reached its conclusion, and often there’s no means to ask.

This is not acceptable and it has to stop. Current mechanisms for governance and regulation cannot be adequate if action depends on passing corporate social responsibility policies.

We have well over 80 ethical codes which have established norms and a broad direction of travel – but plainly, they aren’t enough to meet the biggest challenges we face, let alone the challenges of the future.

We must introduce a requirement for equality impact assessments and necessary adjustments, on an ongoing basis, at the earliest point in the technology cycle.

This idea, of a new pre-emptive corporate duty to evaluate adverse impacts on equality, and take reasonable steps to stop them, is at the centre of a proposal from the Institute for the Future of Work (IFOW) for a new Accountability for Algorithms Act.

Technical transparency isn’t enough. What we need is to ensure that technology is designed in a way which puts people, and equality, first.

Equality should be a core principle of the way we design and use technology, not an afterthought.

The Institute for the Future of Work set up an Equality Task Force to explore these issues by identifying the effects of AI on equality. In its new report for the Equality Task Force, Mind the Gap: How to Fill the Equality and AI Accountability Gap in an Automated World, IFOW finds that algorithms are usually created by developers who work without considering the sweeping consequences of their decisions for large groups of people.

But adjusting equality and data protection laws is not enough. New primary legislation is urgently needed.

Regulations should require rigorous, advance assessment and consideration of collective, as well as individual, harms and impacts on equality. This means going further than traditional, single axes of discrimination.

Over the past few months, people seem to be waking up to systemic racism and its effects on society. We must continue to battle these injustices, but we also need to be proactive in ensuring they do not become entrenched in the codes and algorithms of our collective futures. Let’s celebrate Black history through our actions now, to ensure the future is worthy of celebrating too.


The photo used for this article is from UK Black Tech, licensed under Creative Commons 2.0

Author

Anne-Marie Imafidon MBE
