Building Collective Power to Refuse Harmful Data Systems
August 2020
What can the public do to change unjust collection and use of data?
In January 2019 when IBM published roughly a million photos of unsuspecting people with the goal of improving facial recognition software, many of those people were surprised and upset. IBM promised a chance to opt out, but many thought that wasn’t enough. After years of advocacy, research, tech worker organizing, and class action lawsuits challenging IBM and other tech firms on the risks and errors of facial recognition, IBM announced in June that it would no longer offer, develop, or research the technology.
The movement against facial recognition is just one of several collective actions that challenge the collection and use of personal data. A previous generation of technologists told people to either consent or opt out. Now people are organizing to change what those systems do and how they work. How can we understand these collective actions as an alternative to individual consent?
To understand this growing approach that we call “collective refusal,” we need to see the limitations of individual consent more clearly.
How The Idea of Consent Divides and Conquers
Although the idea of consent was developed to protect individual rights from powerful groups, it reduces decisions about data privacy to an impossible series of individual choices that no person could reasonably manage. Across corporate policies, academic research, and government regulation, consent shifts responsibility away from powerful data-collectors and onto individual people.
How Consent Doesn’t Protect Human Autonomy
Some critics of individual consent argue that what often passes for consent in practice doesn’t always afford the protections to individual autonomy promised in theory.
With individual consent, individuals are expected to make informed, personal decisions about their own welfare. But a single consent decision often affects multiple people. For example, Dr. Amy Hasinoff, who studies how intimate imagery circulates, has pointed out that photographs often include more than one person; the person who takes the photo might not even appear in the image. The person uploading a photo to an online service might be another person altogether. When multiple people have legitimate interests in the circulation of data, no individual-choice consent decision can even record everyone’s interests—especially if they don’t all agree.
Furthermore, many uses of data also affect people across society, not only those directly involved in data collection. A single photo on its own has limited uses. In contrast, large photographic datasets can be used by facial recognition algorithms that are incorporated into abusive policing and immigration systems. Because consent is only meant to help people manage risks to themselves, it cannot prevent risks to society at large.
How Powerful Actors Abuse Consent
Consent is supposed to protect people from abuses of power, but that’s not possible when data-collectors can use their power to influence consent decisions. Data collection often happens within an unequal power relationship in which the data collector is in a position of authority. This unequal power relation is reinforced by information asymmetry between the data collector and participants, who are rarely aware of how their data is used.
Decisions to collect and use data rarely involve the people whose data is sought. For instance, airlines that are replacing boarding passes with facial recognition scanners did not seek the input of international passengers before implementing these systems. When the scanners appeared in airports, they took many people by surprise—but few passengers are empowered to rock the boat in the security line. The choice to say yes or no to individual cases of data collection doesn’t include power to change broader agendas of data use.
How Consent Is Useless Over Time
Individual consent assumes that people can make a one-time decision at the point of data collection that forms an agreement about how the data will be used. Yet online, data collection happens continuously, and new ways of collecting, using, and disclosing data continue to be invented over time.
Consider the case of IBM’s Diversity in Faces dataset, which investigative journalists pointed out was collected without consent using images from Flickr, a photo site that started in 2004. Even if people had consented, they couldn’t have imagined how their data would be passed around and re-used. Over the next fifteen years, people’s photos on Flickr were acquired by a sequence of four companies and hundreds of academic research teams, even as photos from other sites were also added to the archive. At one end of this chain, IBM researchers downloaded a copy of the dataset from Yahoo in 2019 and modified it to create the Diversity in Faces dataset that attracted multiple class action lawsuits.
No one-time decision could possibly manage this complex web. Even worse, it’s almost impossible to track down who has a copy—especially when companies and individuals download the data, make derivatives, and publish new datasets.
Collective Refusal As An Alternative to Individual Consent
If consent can’t protect people, what else can those who aren’t computer scientists or legal experts do about the ongoing collection and misuse of their data, in situations where they lack power?
Researchers have recently described refusal as a “necessary corollary” to consent. For Ruha Benjamin, refusal is a form of agency that involves “refusing the terms set by those who exercise authority in a given context” and which “may also extend beyond individual modes of opting out to collective forms of conscientious objection.”
When people in the IBM dataset organize class action lawsuits to challenge IBM’s use of that data, they’re going beyond the constraints of individual consent to challenge the underlying system. That’s a case of collective refusal.
Refusal is broader than merely saying no to an individual consent request. It is a way to think about the practical actions people take to reject data collection and misuse. While approaches to refusal vary in important ways, successful refusal contests the terms set by data collectors and challenges the structures (like consent) that they use to divide and conquer.
Help Us Identify Examples of Collective Refusal
We think that collective action can help people address the autonomy problem, manage data over time, and replace individual disempowerment with collective power.
To refine this idea of collective refusal, we’re collecting examples of collective refusal in the world. We’re interested in all forms of collective refusal, with a particular focus on refusal of data collection and surveillance by the people whose data is sought.
In many high-profile instances of refusal, successful change is driven by multiple approaches to refusal acting in concert from many different groups within an ecosystem. For instance, recent action to refuse government use of face surveillance technology in policing has involved:
Collective actions by affected communities:
- People whose data was included in a facial recognition dataset refusing IBM’s opt-out process, instead filing class action lawsuits against IBM
- Membership groups like the ACLU and EFF pressing local governments to pass legislation banning the use of face surveillance
- Advocacy groups like the Stop LAPD Spying Coalition suing police to release information about spying programs
Important efforts by experts that we think might be a different kind of refusal from actions that anyone could take:
- Scholars like Joy Buolamwini and Timnit Gebru advocating, testifying, and educating policymakers, practitioners, and the public
- Tech workers pressuring their employers to stop the sale of facial recognition to the government
- Researchers collectively calling on publishers to reject flawed science used to legitimize face surveillance
Other examples of collective refusal might include:
- Browsing with Tor and other forms of obfuscation
- Intentionally sabotaging crowdsourced data
- Shutting down online communities to force companies to change their policies on hate
- Sharing Netflix passwords
- Grassroots internet cooperatives
- “Digital Homesteading” to maintain independent technical infrastructure for your family
Collective refusal is not always about saying no to data collection. After all, activists sometimes use data collection to address gaps in dominant datasets. Further examples include:
- Community-led audit studies that say no to policies that undervalue harassment and make people unsafe
- Websites that bypass publisher paywalls to say no to gatekeeping public knowledge
If you’d like to discuss collective refusal or contribute examples, please reach out at firstname.lastname@example.org.
Further Reading
- Informed Refusal: Toward a Justice-based Bioethics by Ruha Benjamin
- Feminist Data Manifest-No by Cifor, Garcia, Cowan, Rault, Sutherland, Chan, Rode, Hoffmann, Salehi, and Nakamura.
- Non-Participation in Digital Media: Toward a Framework of Mediated Political Action by Casemajor et al.
- Obfuscation: A User’s Guide for Privacy and Protest by Brunton and Nissenbaum
- Privacy Self-Management and the Consent Dilemma by Daniel Solove
- Sexting Panic: Rethinking Criminalization, Privacy, and Consent by Amy Hasinoff
- Resisting Commensurability: Against Informed Consent as an Anthropological Virtue by Kirsten Bell
- Screw Consent: A Better Politics of Sexual Justice by Joseph Fischel
- Commonsense Consent by Roseanna Sommers
We’d like to thank the participants of the Contested Data workshop at Data & Society for conversations that influenced our thinking on this work.