What does Marta Peirano think about automated discrimination? Meeting moderated by Carlos Delclós.
Our first round table at our conference Artificial Intelligence and Human Rights: Utopia or Reality? focused mainly on technology and power. We spoke with Marta Peirano, author of El enemigo conoce el sistema (The Enemy Knows the System). Peirano’s book is a must-read for understanding the challenges technology poses to society.
Delclós asks Peirano what an algorithm is and why the term is so prominent in public debate. Peirano states that “it is a set of decisions, systematic operations, that allow the machine to solve problems without the intervention of a person.”
The point, according to Peirano, is that there are various types of algorithms depending on how they are programmed. Some algorithms are programmed in advance by human decisions, while others are programmed through machine learning. Peirano likens the latter to a programme learning to operate the way animals learn to navigate their environment and carry out tasks.
Both methodologies present a common problem: “The algorithms we have created to make decisions that humans find difficult also learn the prejudices that have affected our previous decisions: elitism, misogyny, racism…”, stresses Peirano.
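The mechanism Peirano describes can be made concrete with a toy sketch (not from the talk; the data and names are invented for illustration): a model “trained” on a skewed historical record of decisions simply reproduces the skew, because the majority pattern in the data *is* what it learns.

```python
from collections import defaultdict

# Hypothetical past loan decisions: (neighbourhood, approved).
# The historical record is skewed against neighbourhood "B".
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True), ("B", False),
]

def learn_rule(records):
    """'Learn' by adopting the majority past decision for each group."""
    tallies = defaultdict(lambda: [0, 0])  # group -> [approved, denied]
    for group, approved in records:
        tallies[group][0 if approved else 1] += 1
    return {g: approved > denied for g, (approved, denied) in tallies.items()}

rule = learn_rule(history)
print(rule)  # {'A': True, 'B': False} -- the learned rule encodes the old bias
```

No one wrote “deny B” anywhere in the code; the discrimination arrives entirely through the training data, which is exactly the point Peirano is making.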
“With the algorithms we are justifying discriminatory attitudes and excusing racist, classist or sexist behaviour, as the decisions are made by machines, not people,” warns Peirano. At this point in the debate the concept of mathwashing was raised, referring to the notion that because mathematics is involved in decisions, algorithms are neutral. The term, popularized by Cathy O’Neil, is evident in an example given by Peirano: “How do you avoid discriminating at a border whose reason for being is to discriminate? What an algorithm will do is set the rules for discriminating.”
In this sense, both Delclós and Peirano agreed that these algorithms entrench our existing prejudices and social inequalities. “The algorithms are disguising discrimination processes: they repeat them, and it seems that you don’t discriminate because the decisions are apparently being made by the machine,” Peirano says.
The conversation between Delclós and Peirano continued, with many more topics covered in the full video: adversarial networks, contact tracing, data protection law, systemic injustices, COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) and others.
In addition, please browse the key resources mentioned in the chat and listed below. Don’t miss them!
- El enemigo conoce el sistema, by Marta Peirano (Debate, June 2019)
- Opinion of the European Economic and Social Committee on The digital revolution taking account of citizens’ needs and rights (own-initiative opinion)
- AlgorithmWatch: “Automating Society” (2019)
- Armas de destrucción matemática – Cómo el big data aumenta la desigualdad y amenaza la democracia, by Cathy O’Neil (Capitán Swing, 2018)
- ProPublica’s investigation into the system used to calculate the risk of recidivism in the USA
- Automating Inequality, by Virginia Eubanks (St. Martin’s Publishing Group, 2018)
- The OpenSchufa project, to understand how the credit-scoring algorithm works in Germany and to demand transparency and change