Young person standing on large toy blocks in front of a moon and star filled sky, looking over mountains with data bits falling off the sides.


Data issues are social justice issues. How can we shift the purpose of ADSs so that they centre on serving the public and help create more just and equitable societies?

An important part of changing how automated decision systems work is to rethink how and why we use them. Many civil society organisations, social movements and local councils are imagining new values, aims and practices around data. Below are just some of their many ideas and experiences.

Large purple mountain with bits of data falling off the sides.

Data for the People

Who do ADSs belong to?

While these systems and infrastructures are often in the hands of large corporations, a growing movement is reclaiming data as a public good. If the data used by these systems is largely produced by the public, it should also belong to, and be accountable to, the public.

Barcelona’s Data Commons

The European project DECODE developed a plan to implement this idea which was piloted in Amsterdam and Barcelona between 2017 and 2019. DECODE “aims to construct legal, technological and socio-economic tools that allow citizens to take back control over their data and generate more common benefits out of them”.

Between October 2018 and April 2019, the Barcelona City Council and other organisations carried out the “Digital Democracy and Data Commons” (DDDC) project, which involved consultations on how to deliver safer and more transparent forms of data enrichment for the people of Barcelona.

The city of Barcelona sees public data as belonging to the data commons, “data of, by and for the people”.

The city of Barcelona sees public data as belonging to the data commons, “data of, by and for the people”. The idea is for communities to govern data together. Data is a shared resource that citizens contribute to, access and use. Data collected by the city is a common good.

Citizens of Barcelona have been able to participate in municipal decision-making since 2016 via the platform Decidim Barcelona (“Barcelona we decide”). They can propose their own initiatives, which are then voted on. In 2016, they co-produced a strategic city plan with the government that integrates thousands of citizen proposals. Numerous cities worldwide have since implemented digital participation platforms, either by creating their own or by adopting software like Decidim.

Green hand with pink data bit held between forefinger and thumb.

Procurement by the Public for the Public

Public participation in procurement of ADSs could make them more inclusive and accountable.

In the UK, the algorithms and software used in ADSs are largely outsourced and bought from private companies. This means that the government often does not have a comprehensive understanding of how an automated decision system works, and also has little ability to change the system if it is found to be problematic.

Calls for changes to public procurement from civil society organisations and city councils have emerged as part of broader debates on expanding public ownership of resources that benefit a municipality. For example, the Preston City Council, together with institutions like the Centre for Local Economic Strategies (CLES), is implementing principles of community wealth building within Preston and the wider Lancashire area in the UK.

“Community wealth building [is a] new people-centred approach to local economic development, which redirects wealth back into the local economy, and places control and benefits into the hands of local people”

“Community wealth building” is defined by CLES as a “new people-centred approach to local economic development, which redirects wealth back into the local economy, and places control and benefits into the hands of local people”. Progressive procurement, that is, procurement that seeks to benefit the community rather than simply minimise cost, is a key tenet of this process.

Yellow hand reaching out for floating pink data bit.

Accountable Algorithms

Algorithms are sets of instructions that a computer uses to solve problems.

But from a human perspective, algorithms are things we live with and that are all around us. They shape our musical tastes, affect who we interact with, orient our political views and influence both individual and collective decisions. Reimagining automated decision systems therefore also means reimagining the algorithms that we feed data into.
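To make the definition above concrete, here is a minimal, hypothetical sketch of an algorithm at work. The scoring rule and the posts below are invented for illustration only; they are not drawn from any real platform, but they show how a short set of instructions can decide what a person sees first.

```python
def rank_posts(posts):
    """Order posts by a simple engagement score: clicks plus two times shares.

    This is a toy ranking rule, written only to illustrate what an
    'algorithm' is: a fixed set of instructions applied to input data.
    """
    return sorted(
        posts,
        key=lambda post: post["clicks"] + 2 * post["shares"],
        reverse=True,
    )

# Invented example data: two posts a feed could show.
feed = [
    {"title": "Local news", "clicks": 50, "shares": 5},
    {"title": "Viral clip", "clicks": 40, "shares": 30},
]

for post in rank_posts(feed):
    print(post["title"])
```

Changing a single weight in the scoring rule changes what appears at the top of the feed, which is one small illustration of why the design of such rules matters to the public.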

An increasing number of organisations, activists and scholars are exploring how bias, inequality and prejudice are coded into algorithms and what can be done about it. Algorithmic accountability encompasses the different attempts to understand and improve how algorithms work and what to do when they don’t work well.

Calling for Equity, Fairness and Social Justice

Organisations like the Algorithmic Justice League (AJL) use “art and research to illuminate the social implications and harms of artificial intelligence [...] and build the voice and choice of most impacted communities”. For example, research by AJL's founder Joy Buolamwini has demonstrated bias in facial recognition systems. She works with communities challenging the implementation of these systems in their homes, seeks policy change by speaking at the UN and other international forums, and has engaged in public education through the film Coded Bias.

The A+ Alliance uses technology “to correct for historic inequity and bias”. At the heart of their work is the concept of ‘Affirmative Action for Algorithms’, whereby women and girls are more fully included in algorithmic accountability processes, as well as in the surrounding funding, auditing, consultation and implementation. The Alliance recently launched the Feminist AI Research network, which aims “to explore new models and new ways of conceiving AI that correct for historic inequities and bring social programs and policy fit for the 21st century”.

Computational scholars have explored reverse-engineering algorithms as a strategy to better understand how they work, improve transparency and advance participation. One team of researchers has designed an app that replicates how algorithmic systems work, showing how greatly our choices are shaped by algorithms.

Orange fist gripping two data bits.

Power Structures

Data and algorithms are themselves the product of longer histories of structural inequality and systemic violence.

They are part of the broader context of data capitalism where “companies use data to preserve the power imbalance” between wealthy and disadvantaged sectors of society.

Data for Black Lives and Dismantling Big Data 

Movements like Data for Black Lives (D4BL) work “to make data a tool for social change instead of a weapon of political oppression”. This is achieved by reclaiming not only data infrastructures, but the power structures that enable datafied racial inequality. 

Key to this movement is the call to abolish big data. In the words of D4BL’s Founder and Executive Director, Yeshimabeit Milner:

Abolition isn’t just about destroying, it’s about creating something new, and it’s also about understanding that, just like in the movement to abolish prisons, that prisons aren’t the solutions to society’s problems. And also, most importantly, that part of abolition is also, just like with the call to abolish big data, to dismantle the structures that concentrate power and wealth and resources into the hands of a few.

Sun emerging from behind clouds.

Questions for the Future

There are different ways we can challenge automated decision systems, different opinions on how to move forward, and different visions of what the future of data and algorithm use should be.

Can we reform and improve current systems? Can we correct algorithms for historic inequity and bias? Can algorithms be tools of social transformation if they are reoriented towards a different goal, like fairer resource allocation?

Or is inequality so deeply embedded that we must seek more radical solutions? Must we do more than just rewrite an algorithm, even if that process is participatory? Is it even naive to apply a technical solution to a deeper historical problem that needs to be addressed in the political, social and economic, rather than the technical realm?

Are controversial technologies, such as live facial recognition, acceptable if clear regulatory frameworks and oversight exist to mitigate their potential harms?

Or do the harms of these technologies for affected communities and the wider public necessitate a definitive end to their deployment?

Should citizens be empowered to produce their own technology? Can we advance public control over data systems through individual empowerment and data literacy? 

Or does this put the onus on individuals, when it should be the role of government to create a regulatory environment in which citizens are protected and data systems work for the public good? 

Discussions on issues like these animate current conversations in academia and among civil society organisations but rarely, so far, among the wider public. Intervention and participation in the datafied society need to start from public debate about the technologies transforming our societies. 

Define the data future.