DEMOCRATISING
DATAFIED
SOCIETY


contents

1.
The Issue
What are the impacts of datafying public services and automating government?
2.
Collective Action
How have civil society organisations made data systems more accountable to people?
3.
Participation Models
How might we create and enhance participatory processes?
4.
Imagining Alternatives
How should we change datafied systems to serve the public and create more just societies?

ABOUT

This website is a Data Justice Lab project based on the report, ‘Civic Participation in the Datafied Society: Towards Democratic Auditing?’. The report is the culmination of a 3-year study on advancing the role of citizens and communities in the deployment of data systems in the public sector.
DOWNLOAD REPORT
PDF (8.7 MB)

1.
THE ISSUE

Public services and government functions are increasingly datafied and automated. That means they are provided, enforced or withheld based on analysing data about people.

Data is collected about us when we apply for benefits, through cameras when we walk down a street, when we register our children for school, or when we do our shopping online, to name just a few examples. We are categorised, profiled and scored with this data, which affects the services we receive. But this often happens without our knowledge and with few possibilities to object.

The growing reliance on data and automation leads to a transfer of power away from citizens. This raises concerns for democracy. How, then, can people intervene and have a say? How can we democratise the datafied society?


Automated Decision Systems Underpin the Datafied Society

Automated decision systems (ADSs) are used by companies, local authorities and other state agencies to analyse data about citizens, make predictions about their needs and actions, and inform how public services are delivered. They are also used to assess citizen behaviour and initiate state responses.

Governments are collecting and analysing a wide variety of data about citizens, and using that data to make decisions about public services—such as social security, health and housing—and state interventions—such as policing and criminal justice. This data may include everything from socio-demographic data such as nationalities, living conditions and income, to behavioural data and social media use. 

Below are two examples of the use of data systems in the public sector.

Datafication Enhances State Power

Live facial recognition technology (LFRT) is increasingly deployed in public spaces. It captures biometric data of people, such as facial features, identifies them in real time, and thus allows for the permanent tracking of citizens’ movements. It has been used to target particular communities and in ways that limit free expression and association.

Yet police forces continue to roll out LFRT, usually without informing the public. Attempts to intervene or to protect oneself from LFRT may be punished. In May 2019, a man in Stratford, UK was stopped and fined after covering his face while walking past an LFRT camera.

LFRT changes the relationship between state and citizens by transferring power to the state and leaving citizens exposed and subject to police action. Yet these technologies continue to be deployed without our consent, and without our understanding of how they will impact our lives and our rights.

LFRT has also been shown to be inaccurate and biased. In 2019, the London Metropolitan Police’s (Met) LFRT misidentified 96% of people flagged as potential criminals. Researchers have documented racial and gender biases embedded in facial recognition systems. The consequences are disproportionately felt by people who are already marginalised—as in the case of a 14-year-old Black child who was wrongfully identified and detained using LFRT in Romford, UK in 2019.


Unaccountable Design Fails the Public

In 2017, the UK government rolled out Universal Credit, a single point of contact for all welfare, as part of a larger strategy to transform government through digitising services and implementing ADSs.

Universal Credit replaced six benefits, including housing benefit, child tax credit and income support. The algorithm that determines eligibility and the amount of benefit has been shown to systematically disadvantage recipients.

The system was built around regular monthly salaries, but is ill-equipped to handle the realities of many who actually need Universal Credit. Many claimants, such as part-time and service workers, are paid at irregular intervals. As a result, the system commonly overestimates their earnings and withholds benefits they are entitled to.


Automating this system has caused severe income losses. People have said they were forced to skip meals and go hungry, while others borrowed money to cover the gap and have since fallen into debt. Uncertainty around when the next check will or will not come has caused psychological distress.

By automating welfare, Universal Credit has replaced consideration of citizens’ specific lives and circumstances with a flawed algorithm, and has made it harder for claimants to object and correct wrongful treatment.


No Digital Government Without Us

Automated decision systems are all around us. Yet the gravity of their impact is unmatched by our awareness of them and our voice in their deployment.

ADSs make judgements about us according to criteria that are not transparent, with consequences that may harm us, and without offering us a clear avenue for redress.

All this raises fundamental questions regarding the state of democracy in the datafied society. A democracy relies on people being active in their own governance. However, the public does not have much opportunity to gain understanding or have a say in what systems should be in place and whether important state functions should be datafied and automated at all.

What are avenues for people to participate in decisions about the use of ADSs by public institutions?

By exploring civil society strategies, ways to bring more participatory methods to government, and challenging the status quo on how we think about data, we can find inspiration for change.

Democratisation
over automation.