
Regulating for algorithm accountability: global trajectories, proposals and risks

An Ada Lovelace Institute virtual event with the Institute for the Future of Work

Thursday 3 December 19:00-20:00 GMT

Join the Ada Lovelace Institute, the Institute for the Future of Work and international experts in algorithm accountability to explore how we can ensure that algorithmic systems, and those deploying them, are truly accountable. The event will surface different global approaches, discuss them in relation to their governance landscapes, explore possible risks and consider regulatory options.

Speakers and co-chairs:

  • Benoit Deshaies, A/Director, Data and Artificial Intelligence, Office of the Chief Information Officer, Treasury Board of Canada Secretariat
  • Albert Fox Cahn, Executive Director of Surveillance Technology Oversight Project and member of the New York City Automated Decision Systems Task Force
  • Helen Mountfield, Principal of Mansfield College, Oxford and Chair of the Institute for the Future of Work’s Equality Task Force
  • Craig Jones, Deputy Chief Executive, Data System Leadership group, New Zealand
  • Carly Kind, Director, Ada Lovelace Institute (co-chair)
  • Anna Thomas, Director, Institute for the Future of Work (co-chair)

The extensive use of algorithmic decision-making in all domains of social life requires specific accountability mechanisms and regulations that ensure meaningful redress. This is an especially hard task when little information about the algorithms in use is available in the public domain, and when their implementation rationale, and the organisations ultimately responsible for their functioning, remain opaque.

While there is little consensus over what approach to take, countries across the world have started designing and applying different mechanisms to boost algorithm accountability.

In 2018, New York City launched a Task Force to make recommendations on how the city should manage automated decision-making systems. Earlier this year, New Zealand issued an Algorithmic Charter to be deployed in case of high-risk applications. Canada has developed a model of Algorithmic Impact Assessment, a scorecard that helps identify the level of risk of an algorithm and mitigation factors. More recently, the UK Institute for the Future of Work’s Equality Task Force has released a report highlighting gaps in legal protection and mechanisms for accountability, and calling for new legislation: an Accountability for Algorithms Act.

The Ada Lovelace Institute, the Institute for the Future of Work and international experts in algorithm accountability will surface key concerns and relate them to the governance and regulatory landscapes of different national contexts. In this event we ask:

  • How do we ensure that algorithmic systems and the agencies and organisations deploying them are truly accountable? Is new regulation necessary?
  • What can we learn from the different approaches in New Zealand, Canada and New York City?
  • How do they relate to their respective regulatory and administrative contexts?

Register for the event

We are using Zoom for virtual events open to more than 40 attendees. Although there are issues with Zoom’s privacy controls, when reviewing available solutions we found that there isn’t a perfect product, and we have chosen Zoom for its usability and accessibility.

A recording and summary of the talk will be available on the Ada Lovelace Institute website shortly afterwards.


Date

December 3, 2020 19:00

to

20:00

