Is it possible to (de)racialise AI and re-appropriate technology?

As we explained in previous articles, at the end of May we co-organised JornadasDAR Democracy, Algorithms and Resistances with Algorace and Algorights. Over three days we were able to enjoy and learn from lectures and talks about a more democratic and decolonial artificial intelligence that respects human rights. Here you will find summaries of the talks that took place in the Conciencia Afro space in Madrid; remember that you can watch the full videos here.

How to (de)racialise artificial intelligence?

Ana Valdivia, Youssef M. Ouled, Javier Sánchez and Paula Guerra Cáceres, members of Algorace, opened the session in Madrid with this round table. In her intervention, Paula Guerra Cáceres asserted that “artificial intelligence operates within the framework of structural racism: it makes the person disappear and replaces them with decisions based on historical patterns and prejudices”.

So what needs to be done to (de)racialise AI? According to Guerra Cáceres, “we have to educate about how it affects us and which public administrations and companies are developing these technologies”. To de-racialise artificial intelligence, Guerra Cáceres proposed some ideas: 

  • encouraging debate on the scope of AI use
  • finding out how public administrations and private companies are using these technologies
  • involving the groups of people affected by AI
  • promoting the inclusion of racialised people in AI-related careers, for example through scholarships

According to Algorace, white profiles dominate the technical positions in which artificial intelligence is developed. That is why the collective is drawing up a directory of racialised professionals, “because other perspectives are needed to be able to design fairer algorithms”, Sánchez explained.

Another interesting aspect of the collective’s work, which brings knowledge closer to the general public, is the fact that it compiles problematic artificial intelligence systems and explains them in plain language. According to Sánchez, they also challenge the Anglo-Saxon technological philosophy and look for cases that are closer to home so that people can identify with European experiences and logic. 

“We have to be clear that Artificial Intelligence has a patriarchal and colonialist origin, at the service of the powerful,” Valdivia said. As a solution, Valdivia proposed that organisations and social movements should re-appropriate technology and artificial intelligence systems and use them to their advantage. 

If you want to know more about the report that Ana Valdivia and Javier Sánchez are working on with the rest of the Algorace team, and to hear the answers to the questions posed by the audience, we recommend watching the full video here!

How to incorporate a digital perspective in an anti-racist organization?

GHETT’UP’s Safia Oulmane spoke with Algorace’s Miriam Hatibi about the challenges that anti-racist organisations face in adopting digital tools without putting their organisations and members at further risk.

What areas does GHETT’UP focus on?

  • changing narratives about working-class people and people living in the banlieues
  • empowering young people in these neighbourhoods
  • advocating for social and environmental justice
  • fighting racism

At GHETT’UP, Oulmane explained, they are also working with the Justice, Equity and Technology project at the London School of Economics and Political Science to understand the technological component of France’s Global Security Law, proposed in October 2020. The law, as she explained, was created to give more power to the police and to “justify the use of drones at demonstrations”, among other things.

For Oulmane, all this work is essential to understanding how the relationship between the police and technology affects people in working-class neighbourhoods. Oulmane also explained that they are collecting data on these neighbourhoods in order to tell their own story. “We have to explain our reality directly; otherwise they approach us as if we were a zoo. The work behind the collective struggle is exhausting, but we have to create our own data so that others don’t talk about us and stigmatise us”, she said.

On the other hand, Oulmane stressed that in addition to fighting and resisting, it is important to show positive things about these neighbourhoods and their people, so that “the new generations feel proud”. “It is exhausting work, both physically and psychologically, and all this has to be done as a community. Don’t give up, work as a team and remember that care is politics,” Oulmane said when asked by Hatibi about the importance of finding time to take care of yourself in the anti-racist struggle. 

Conversation “If we want a future in freedom, we have to re-appropriate technology”

At the end of the afternoon in Madrid, almost as the culmination of JornadasDAR, we welcomed Cinthya Rodriguez from the collective Mijente, which defines itself as a new political home for Latinx and Chicanx organising.

Cinthya explained the connections between big tech companies and the US immigration authorities in the surveillance and deportation of migrants. She also presented the #NoTechforICE campaign, which denounces the technology companies that have built tools enabling the surveillance, imprisonment and deportation of migrant communities.

“We see that the technology that the police have used to monitor undocumented people is the technology that they have then applied to the entire population. We need to understand how the migration system and the police work, and fight against both”.

“How do we build a movement for a future free of surveillance?”, Rodriguez asked.

As Rodriguez explained, Mijente focuses on questions such as:

  • who sells the data collected by these technological systems
  • which companies analyse the data, and what they then do with it
  • where the data is stored, and what the implications of cloud storage are
  • how biometric data is generated, and what happens to the 200,000 people in alternative-to-detention programmes
  • who is investing in these data-collection technologies, and why

Finally, Cinthya Rodriguez launched a message of struggle and collective resistance: “The way we are going to win with the #NoTechForICE campaign is by building people’s power.”

To close #JornadasDAR, we split into groups to reflect on the decolonisation of AI, the democratisation of AI, and the use of AI at borders and in migration, just as we had done the day before in Barcelona.

Remember that you can now watch all the videos of JornadasDAR Democracy, Algorithms and Resistances and listen to the talks from all three days.

Do you want to receive the Societat Oberta agenda by e-mail?