Roundtable #4: Predictive Tools

Predicting the future: predictive tools and segmented societies

Our fourth roundtable at the Conference on Artificial Intelligence and Human Rights brought together four professionals to discuss trends in the application of Artificial Intelligence to crime prevention and the administration of justice. The session also took a somewhat futuristic turn, as the panel tried to anticipate dystopian scenarios in which justice and predictive technology intertwine. Our main question was: what opportunities and risks does Artificial Intelligence offer, and how can we prevent it from violating political, social and civil rights? Cristina Goñi moderated the following guest contributors:

  • Juan José López Ossorio, head of the Studies area of the Coordination and Studies Cabinet of the SES (Secretary of State for Security)
  • Gemma Galdon, expert in technological social impact and president of the Eticas Foundation
  • Andrés García Berrio, expert lawyer in civil and political rights of IRÍDIA
  • Jorge Morell, head of the Legaltechies consultancy and expert in predictive justice

Jorge Morell opened the round of presentations with a question: predictive justice, fact or fiction? “Many people think of Minority Report when they hear about predictive justice, because in the film you could predict crimes before they were committed.” Morell warned that, although we are not yet at that point, there are signs that we are on the way, citing as an example a news item on ECtHR rulings and algorithms. But what is predictive justice? Morell defined it as “the middle ground between Big Data, data analysis and the use of algorithms”. If you watch the complete video you will find a balanced analysis of the benefits and downsides of this system, which Morell outlines extensively with real and very interesting examples.

Technology against gender-based violence

In his turn, Juan José López Ossorio presented VioGén, the system of the Secretary of State for Security of the Ministry of the Interior that uses technology in the fight against gender-based violence. López Ossorio presented this tool, in use since 2007, as an “example of how technology can help in the face of a socio-criminal problem”. He highlighted the four pillars on which the Secretariat works: scientific studies, institutional coordination, specialized units and technology. He also emphasized that work is being done tirelessly to improve and grow the tool, paying special attention to “detecting blind spots and seeing to what extent we can improve, especially in terms of risk assessment.”

Picking up on Morell’s remarks, Galdon recalled that “the history of the police has been closely linked to that of technology, and there has always been debate about this relationship and security.” And if Morell mentioned Minority Report, Galdon alluded to The Wire as a fictional reference in which these issues were addressed. “Will technology save us from crime?” she wondered. “The police are not leading on AI issues; other sectors of social administration are,” Galdon said. She also stressed that the use of predictive or advanced technologies in policing is far more sensitive and dangerous than in other fields such as consumer marketing: “An algorithm deciding which brand of yoghurt to show you when you shop online is not the same as one that can send you to jail.” Finally, making explicit reference to the title of this roundtable, Gemma Galdon commented: “People have too high expectations about the capabilities of technology. It needs to be demystified a little. Algorithms cannot guess things; they only act on facts that have already happened.”

Polarized societies

The meeting was closed by Andrés García Berrio, who asked how algorithms can be built that do not accentuate existing social inequalities. “It is more difficult to build plural societies when algorithms only show limited thought,” said García Berrio. “One of the issues in criminal justice that worries us most is that we are returning to the individualization of criminal responsibility. We are forgetting that there are factors that determine why migrants are more represented in the criminal system than white people.” And indeed, as was seen throughout these days, decisions related to tools for predicting future crime are neither fortuitous nor casual: “These mechanisms are made by people, and it is they who introduce values or discrimination into these tools,” García Berrio recalled. In the final part of his intervention, he discussed RisCanvi, a programme designed to prevent violence in the prison environment in Catalonia. You can hear more about this programme in the recording of our roundtable; don’t miss the whole video.

And if you get to the end of the video you can also listen to the group’s reflections, prompted by questions from the audience. Here are some links to several resources mentioned during the event:
