This paper examines ways to mitigate bias in automated hiring at each stage of the process.

All the Ways Hiring Algorithms Can Introduce Bias

Understanding bias in hiring algorithms, and how to mitigate it, requires us to explore how predictive technologies work at each step of the hiring process. Though they commonly share a backbone of machine learning, tools used earlier in the process can be fundamentally different from those used later on. Even tools that appear to perform the same task may rely on completely different types of data, or present predictions in substantially different ways. Analysing predictive tools across the hiring process helps to clarify just what “hiring algorithms” do, and where and how bias can enter the process. Unfortunately, most hiring algorithms will drift toward bias by default. While their potential to help reduce interpersonal bias shouldn’t be discounted, only tools that proactively tackle deeper disparities offer any hope that predictive technology can promote equity rather than erode it.

Chosen by: Joshua Simons

Theme: Equality

