Civil society organisations are at the forefront of exposing the impact of data systems, but they also face challenges in making real change and empowering citizens and communities.
Here are five strategies civil society organisations have used to make data systems more transparent and accountable to people.
If governments were transparent by publicly declaring and publishing the data systems they are using, civil society would have greater ability to highlight the potential impacts before harm occurs.
Yet in the absence of a public, actively maintained register of government data systems, researchers, journalists and civil society organisations rely on the provisions of the General Data Protection Regulation (GDPR) and Freedom of Information (FOI) requests to uncover how data systems are being used. Under these rules, members of the public are entitled to request information from public authorities.
GDPR gives every data subject a set of enforceable rights over their personal data. In Finland, Digirights.info offers free online courses that help citizens build the skills and knowledge to exercise these rights, including the right of access, exercised through subject access requests (SARs), and the right to erasure.
Citizens can use the website to provide details of any SARs they make and the feedback they receive from organisations, including how long the organisation took to respond. This allows citizens to collectively identify organisations not complying with GDPR and helps Digirights build legal cases.
In order to effectively use GDPR, civil society organisations have called for a legal provision for collective redress.
Article 80(2) is the GDPR provision that allows NGOs to represent data subjects en masse, without seeking a mandate from each of them. Collective redress, as opposed to individual lawsuits, is a more efficient means of tackling data harms because data protection breaches typically affect large groups of people rather than single individuals.
This provision, though, has not been transposed into the UK’s Data Protection Act.
In March 2018, a coalition of privacy and civil rights groups in the Netherlands brought a lawsuit against the Dutch government’s use of System Risk Indication (SyRI), a data analytics system used to assess risk of welfare abuse and tax fraud.
Not only was this system designed in a way that placed all citizens under “general suspicion,” the Dutch tax authorities also wrongly accused more than 26,000 families of committing fraud. Why? Because the system applied an overly strict interpretation of what fraud meant: any minor mistake in a document resulted in a family being classified as fraudulent. Thousands of families were pushed into bankruptcy.
The lawsuit was successful and a court order was issued to stop the use of SyRI.
In the UK, the bulk of the responsibility for overseeing the deployment of data systems falls to the Information Commissioner’s Office (ICO).
It’s the main public body with the power to change how different actors collect and use data.
However, there is no formal way for the public to bring an issue to the ICO’s attention, other than lodging a complaint about an infringement of personal data rights under GDPR. While individual participation is mostly reduced to ad hoc petitioning, civil society organisations and coalitions can exert more power.
Amnesty International called for the ICO to investigate the London Metropolitan Police Service’s use of a database known as the gangs matrix. The police used the gangs matrix to target and surveil suspected gang members.
The ICO found that while there was a valid purpose for the database, data was shared with third parties without proper protocols, in breach of data protection laws. It also found that the gangs matrix failed to distinguish between victims of crime and offenders, leading the Met to target and profile non-violent people and victims of crime.
The ICO issued an Enforcement Notice, compelling the Met Police to ensure it complies with data protection laws in the future. As a result of this ruling, the Met Police also removed hundreds of people from their database of alleged gang members.
When formal systems are inadequate, civil society organisations often turn to campaigning. Campaigns are a key way to create awareness, engage different communities, and advance data literacy.
In 2016, the Department for Education started collecting school pupils’ nationality and country of birth data as a means of enforcing immigration control under the hostile environment agenda.
This policy was implemented without public announcement, through changes to the Early Years Census and School Census, with the data stored in the National Pupil Database. The data system only came to light after a lengthy FOI process by Defend Digital Me.
In the aftermath of these findings, Schools Against Borders for Children (ABC) was established by parents and schoolteachers to fight the policy. Joined by two civil society organisations, Defend Digital Me and Liberty, the campaign successfully stopped this data collection.
There is growing recognition in civil society that engaging with the use of data systems will require organisations to collaborate across sectors and build coalitions with those communities most affected as well as the professionals—social workers, teachers, healthcare professionals—who work closely with them.
Under the hostile environment policy, a backroom deal was struck in November 2016 between the UK Home Office and the National Health Service (NHS) that gave the Home Office access to confidential patient information to aid immigration enforcement.
Alongside the ABC campaign, Liberty launched the “Care Don’t Share” campaign in December 2018, calling on public sector workers, unions and members of the public to sign a pledge in support of a data “firewall” between public services and immigration enforcement.
Migrants’ Rights Network, the National Education Union, the Runnymede Trust, and the Platform for International Cooperation on Undocumented Migrants also joined the campaign. Migrants’ Rights Network and Liberty jointly launched a legal challenge against the Home Office.
In May 2018, as a result of the challenge, the Home Office agreed to limit its use of NHS data for deportation cases to those suspected of committing a serious crime. In November 2018, NHS Digital completely withdrew from the arrangement.