How can citizens work for a world where digital power is guided by evidence and accountable to the public?

Citizen research and action on digital life are more important than ever, since most people no longer trust the tech industry to do the right thing. As communities, researchers, journalists, and advocates build industry-independent projects, how can our work be driven by the public interest?

This November, the Citizens and Technology Lab convened communities, researchers, and advocates to chart new directions for research and action for flourishing digital societies. This post reports our conversations about:

  • Meaningful social, political, and scientific conversations at scale
  • Supporting communities of color online and reducing racism on reddit
  • Engaging with the public on privacy and consumer protection
  • Connecting homeless people with support and resources

This report only includes names and photos for some of our participants. To protect attendee safety and privacy, we have kept some contributors anonymous or pseudonymous.

Citizen Agendas for Research and Action on Digital Life

At CAT Lab, we believe that society and science both advance through research that’s shaped by affected communities. At our summits, we imagine research together that could make a difference in people’s digital lives.

we believe that society and science advance through research that’s shaped by affected communities

Our first community research summit (Boston, Jan 2018) gathered researchers and community moderators to develop our research with reddit communities. We also facilitated wider discussions among a network of advocates, educators, and journalists:

  • A reddit community imagined countermeasures to algorithm-driven harassment, which CAT Lab then tested
  • A community that supports marginalized voices developed research designed to reduce the effects of online attacks on newcomers
  • A PhD student collaborated with communities to design machine learning models for community-led content moderation
  • A team created educational resources for teaching students and communities about algorithmic audits and accountability
  • Researchers and advocates made a plan for studying Massachusetts prescription drug surveillance; that team contributed to changes in Massachusetts law

At our second summit this summer, we convened 30 Wikipedia contributors from over 15 nations in Stockholm to set directions for our collaborations with Wikipedians.

These summits help CAT Lab prototype “citizen agendas” and organize our fundraising priorities while also supporting a wider network of research and action. The Citizens and Technology Summit in NYC was our third event of this kind. Here’s what we learned:


1. Meaningful Social, Political, and Scientific Conversations at Scale

Participants: Julia Kamin, Josh, two r/feminism moderators, and /u/likeafox

Reddit communities host some of the largest group discussions on the English-language internet: 11% of American adults, and 22% of adults aged 18 to 29, use reddit. These groups, like parallel groups on Facebook, WhatsApp, and Telegram, are governed and managed by volunteers who moderate some of the most influential conversations online.

because platforms direct massive flows of attention into communities, they can regularly feel like a battleground.

Reddit hosts tens of thousands of active communities on topics ranging from books and personal finance to science, news, porn, and videogames. People sometimes join conversations (and moderate them) when they’re interested in a topic. Many also participate because reddit’s massive scale makes discussions seem consequential. reddit’s users are also more news-focused than those on many other platforms.

As we found in CAT Lab’s research with r/politics, many see large reddit communities as digital territory for advancing ideas and competing with opposing groups. Elsewhere, subreddits for learning and knowledge like r/science, r/iama, and r/askscience work to grow understanding among massive audiences.

Beyond those larger groups, smaller communities like r/feminism host discussions and create a refuge for people seeking encouragement. Because reddit’s algorithms (like those of Twitter, Instagram, and Facebook) direct massive flows of attention into communities and allow anyone to comment or vote, all of these groups can regularly feel like a battleground. For example, groups sometimes form “brigades” to overwhelm a community by influencing votes, drowning out opposing voices, or making people feel unsafe.

Managing Inclusion and Protection

groups of harassers sometimes send threats; others try to mislead readers and undermine understanding

reddit communities regularly struggle with bad faith actors trying to disrupt or derail conversation. In r/feminism, groups of harassers sometimes send threats and dismissive jokes to people sharing personal stories of sexual assault. In knowledge-focused subreddits, some work to mislead readers and undermine understanding. In both kinds of communities, moderators struggle to distinguish bad faith actors from people who care about the community’s goals. We discussed:

  • Using machine learning to identify good-faith commenters, similar to Wikipedia’s Good Faith machine learning classifier (a minimal sketch of this idea follows this list)
  • Testing ideas to support people who genuinely want to participate in good faith
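
As a starting point for these conversations, here is a minimal sketch of what a good-faith comment classifier could look like. The labeled file `comments.csv` and its `text` and `good_faith` columns are hypothetical placeholders for data that a community and research team would assemble together; this illustrates the general idea, not Wikipedia’s Good Faith model or any CAT Lab system.

```python
# Minimal sketch of a good-faith comment classifier (hypothetical data).
# Assumes a labeled file "comments.csv" with columns:
#   text        - the comment body
#   good_faith  - 1 if moderators judged the comment good faith, else 0
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

comments = pd.read_csv("comments.csv")
X_train, X_test, y_train, y_test = train_test_split(
    comments["text"], comments["good_faith"], test_size=0.2, random_state=0
)

# Bag-of-words features plus a simple linear model, as a transparent baseline
model = make_pipeline(
    TfidfVectorizer(min_df=5, ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000, class_weight="balanced"),
)
model.fit(X_train, y_train)

# Report precision and recall so moderators can weigh false alarms against misses
print(classification_report(y_test, model.predict(X_test)))
```

Any real system along these lines would need community review of the labels and close attention to false positives before it informed moderation decisions.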

Empowering and Scaling Behavior Change Online

While most people think of moderation as content removal and bans, moderators have a wider toolkit of interventions. At the summit, we discussed ideas that could influence behavior and support great conversations.

  • Testing and replicating ideas for establishing beneficial social norms
  • Encouraging, rewarding, and nudging commenters to improve conversations
  • When removing an unruly comment or banning an account:
    • explaining the reasons for content removal
    • sharing guidance on how to be an effective and persuasive communicator
    • adjusting the strength and duration of penalties for breaking community rules
  • Further automating moderation of bad faith actors

Moderation Consistency & Resilience

volunteer moderators have little to no training or support, and it’s easy to burn out

Whether a moderation team is small or large, reviewing and intervening in conversations is hard work, and it’s hard to keep consistent. At our summit (and our previous gathering of reddit moderators), participants pointed out that volunteer moderators have little to no training or support, and it’s easy for mods to burn out. We discussed ideas to improve the quality of experience for volunteer moderators and communities:

  • Creating and maintaining moderator support networks
  • Creating shared resources on best practices for moderation
  • Developing support on mental health and resilience for moderators
  • Imagining community-customizable trainings that could improve moderator consistency and help moderators manage the workload

(for more ideas, see “Moderator Support, Burnout, and Platform Accountability” below)


2. Supporting Communities of Color Online & Reducing Racism on reddit

Participants: u/yellowmix, William, Wesley, and Charlie

While black people use other social platforms in similar proportions to the general population, only 4% of black American adults (roughly 1 to 2 million people) use reddit, compared to 11% of all US adults. Among people in the US, reddit is overwhelmingly white. It hosts (but quarantines) white nationalist groups, and the platform often sends mixed signals on its policies governing racism and hate speech.

conversations on race can be steered by an overwhelming number of white participants, making it hard to authentically support the needs of people of color

It’s hard to host a safe, undisrupted conversation among black people or to discuss race on reddit. Why? Anyone on reddit can join any conversation, and anyone’s upvotes and downvotes shape what’s most visible. Public discussion groups have to reckon with vote brigades from racist and white nationalist groups. They also have little support from the platform to protect commenters from hateful and threatening private messages.

Even when a community is not under attack, vote outcomes on topics of race can be steered by the overwhelming number of white participants, making it harder to authentically support the needs of people of color. When you add the common difficulties of structural bias in demographically imbalanced groups, it’s unsurprising that people of color remain such a small minority on the platform.

Because reddit is so popular and so white, people of color need supportive conversations about race without harassment and derailment from white users. (See Dosono’s research on identity work on reddit among Asian Americans and Pacific Islanders.) At the same time, reddit offers more opportunity for community moderation than Twitter, where harassment and racism are also common. To support meaningful conversations, POC communities have created innovative software and policies. For example, r/blackpeopletwitter verifies skin tone and vets allies for internal community conversations. Other communities protect themselves from brigades and organized harassment with software that auto-bans voters and commenters who frequent communities that organize harassment.
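
To make that mechanism concrete, here is a rough sketch of the kind of auto-ban logic such tools use, written with the PRAW library for the reddit API. The credentials, the protected community, the watchlist of subreddits, and the threshold are all hypothetical placeholders; tools like SaferBot have their own implementations and policies.

```python
# Rough sketch of auto-ban logic in the spirit of the tools described above.
# Credentials, community names, and the threshold are hypothetical placeholders.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_BOT_ACCOUNT",
    password="YOUR_BOT_PASSWORD",
    user_agent="protective-moderation-sketch",
)

PROTECTED = "example_protected_community"    # community being protected (placeholder)
WATCHLIST = {"example_harassing_community"}  # subreddits known to organize harassment (placeholder)
THRESHOLD = 3                                # recent comments in watchlisted subreddits that trigger a ban

def watchlist_activity(redditor):
    """Count the user's recent comments posted in watchlisted subreddits."""
    count = 0
    for item in redditor.comments.new(limit=100):
        if item.subreddit.display_name.lower() in WATCHLIST:
            count += 1
    return count

# Watch new comments in the protected community and ban accounts that are
# heavily active in watchlisted subreddits.
for comment in reddit.subreddit(PROTECTED).stream.comments(skip_existing=True):
    author = comment.author
    if author and watchlist_activity(author) >= THRESHOLD:
        reddit.subreddit(PROTECTED).banned.add(
            author, ban_reason="Frequent activity in communities that organize harassment"
        )
```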

These creative innovations are imperfect and often attract attacks that make the unpaid work of volunteer moderators even more labor intensive and emotionally burdensome. At the Citizens and Technology Summit, we discussed ideas for research and interventions that could break through this complex set of needs and make a difference.

Understanding reddit Behavior by Demographics and Views of Race

To better support the people who participate, communities that host conversations on race need more information about the demographics of who participates, as well as their views on issues of race. Moderators are interested in efforts to:

  • Survey participants to understand who they are, why they take part in discussions about race even when doing so exposes them to racism and stories of discrimination, and why some stop participating
  • Study the effectiveness of community identity verification programs
  • Research whether the diversity of tech employees has any influence on how a platform supports marginalized communities

White Users, Racism, & Antiracism Interventions

Communities for people of color on reddit could benefit substantially from efforts to intervene and influence social norms. Because reddit has a large number of white American users, it might be a productive context for training and supporting white users to intervene and support communities of color online. Before they do so, people who wish to be in solidarity with people of color need to develop the capacity to intervene well and also confront their own involvement in systems of race and racism. To make progress in this area, we discussed ideas to:

  • Analyze data already collected by communities about what white redditors think it means to be an ally to black communities online
  • Develop and test new ways to train, support, and organize white reddit users against racism
  • Test actions that white people can take to
    • intervene on racism
    • curate conversations about whiteness and race
  • Estimate the effectiveness of algorithms such as SaferBot with historical data, studying the impact across the more than 30 communities that have used it (a simplified analysis sketch follows this list)
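
For that last idea, here is a simplified sketch of one way to start with historical data: compare a harm measure (for example, removed harassing comments per week) before and after each community adopted the tool. The file `moderation_log.csv` and its columns are hypothetical placeholders, and a credible study would need a stronger design, such as an interrupted time series with comparison communities.

```python
# Hypothetical sketch: before/after comparison of a harm measure across communities
# that adopted an auto-ban tool. "moderation_log.csv" and its columns
# (community, week, removed_harassment, adopted_week) are placeholders, and each
# community is assumed to have data both before and after adoption.
import pandas as pd

log = pd.read_csv("moderation_log.csv", parse_dates=["week", "adopted_week"])
log["period"] = (log["week"] >= log["adopted_week"]).map({True: "after", False: "before"})

# Average weekly removals per community, before vs. after adoption
summary = (
    log.groupby(["community", "period"])["removed_harassment"]
    .mean()
    .unstack("period")
)
summary["change"] = summary["after"] - summary["before"]
print(summary.sort_values("change"))
```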

Moderator Support, Burnout, and Platform Accountability

reddit’s business model, like those of Facebook, Xbox, WhatsApp, Telegram, and Twitter, relies on volunteer labor to carry out moderation, as CAT Lab founder J. Nathan Matias has written about extensively over the years. At the summit, participants wanted greater scrutiny of this arrangement and of the ways that reddit as a platform uses (or doesn’t use) its substantial power online.

  • Moderator-facing: (see “Moderation Consistency & Resilience” above)
    • Develop new ways for moderators to share successes and best practices
    • Prototype and test fundraising models that could offer meaningful support to moderators while navigating reddit’s policies about compensation
    • Test ideas for recruiting and training moderators
    • Test ways of organizing moderator time and effort to minimize burnout
  • Platform-facing:
    • Audit the reddit platform’s policies and interventions on hate speech

3. Engaging with the Public on Privacy and Consumer Protection

Participants: Katie, Omar, Bonnie, Katherine

How can the public understand harms related to their data, help build a picture of data misuse, and contribute to change?

Digital consumer protection has always been part of the vision for CAT Lab, which draws inspiration from 19th century citizen science in food safety and the environment, both of which were led by women’s movements. We have also written about algorithmic protection for consumers and citizens and about the obligation to test platform products and policies on our safety and civil liberties. In 2019, supported by a grant from the Ethics and Governance of AI initiative, CAT Lab began a search for potential partners. At the Citizens & Technology Summit, we announced a collaboration with Consumer Reports Digital Lab to test ideas for engaging volunteers in distributed research and testing on digital products and services.

Projects that empower the public to contribute to science and exercise their rights rely on public understanding. At the Citizens and Technology Summit, we brought together colleagues from Consumer Reports with people from coveillance.org, a collective that’s working with the ACLU of Washington to create a “watching the watchers” toolkit for community organizers to build capacity in their communities to resist surveillance.

Understanding Harms and Seeing Infrastructure

While each of us is affected by data privacy risks, dangerous surveillance and data use aren’t as easy to see or understand as furniture that can tip over onto children or bike helmets that protect your head in a crash. At Consumer Reports, the Digital Lab team is working on ways to inform the CR membership about privacy risks that affect them and others, especially to reach people who aren’t already familiar with technology and privacy issues. Coveillance has worked on similar questions in Seattle, which recently adopted regulations governing surveillance technology. Public oversight works best when the public understands the issues and engages with those processes.

At the summit, people at the table:

  • Shared story ideas that would make these issues real to people in vulnerable situations, as well as people who are less vulnerable but might care
  • Brainstormed storytelling formats and genres that can make infrastructure and aggregate risks more immediate to people
  • Mapped out social movements and advocacy groups working on these questions 

Growing Privacy Literacy and the Capacity to Spread It

a broadcast approach to informing people about privacy isn’t enough

Because privacy matters to everyone, because it’s complicated, and because it depends on context, a broadcast approach to informing people about privacy isn’t enough. At the summit, we learned that both Coveillance and Consumer Reports have developed trainings and train trainers to teach their own communities about privacy and surveillance questions. Coveillance has published details of its pilot workshops in Seattle.

During the summit, our groups swapped resources and also discussed further ideas:

  • Offering feedback on trainings, including community assessments with key stakeholder groups
  • Making a list of gatherings and festivals to prototype and share training resources
  • Discussing ways to ensure that privacy training materials have clear theories of change and are relatable for the people who participate, including pathways such as:
    • personal decisions about how to use technology systems
    • participation in advocacy and public oversight
    • participation in testing and accountability research
    • training and informing others

Illustrating Data-Related Risks

In the world we envision, with plentiful public knowledge about data sharing, privacy risks, and algorithmic bias, how can people link their own experience to these issues? At the summit, participants prototyped ideas for visualizing data-related risks and making them personal to individuals and communities. At CAT Lab, we’re especially excited about projects that illustrate what we know *and* what isn’t yet known, in ways that motivate and encourage the public to contribute to answers.


4. Connecting Homeless People with Support and Resources

Participants: Adam, Aparajita, Bishop, Kay, and Maurice

Homeless and formerly homeless people already organize and support each other through social media and online communities. How can that work be supported, and what new ideas would improve people’s access to support and resources?

This table at the summit included researchers, designers, and homelessness advocates. Lex and Kay are homelessness advocates and researchers. Bishop is on a team assessing youth homelessness services with New York City’s Youth Homelessness Task Force. Adam is the founder of Streetlives, an upcoming platform for community empowerment and self-representation co-designed by people who are homeless or in poverty. Aparajita is a PhD student at Cornell who studies digital homeless networks. Maurice is also a PhD student at Cornell University.

Online Groups and Digital Resources

How do homeless people in New York City use social media now, and what would improve that experience?

social media helps people triage their needs… but we don’t commoditize community support

Social media helps people triage their needs. As one participant at the table described it: “On Instagram, you can find people who are going to see you, and you can get support and affirmation that, yes, there’s a person behind this page who can get me what I need. Facebook can be a way to find a community or group that can provide resources, products, services, or a ride somewhere. It also provides a community and support for people across different states who identify with each other. We don’t commoditize our community support: someone shares a GoFundMe, or someone shares a resource. I welcomed vulnerability, and I went online and asked for what I needed.”

At the summit, the group discussed:

  • ways to crowdsource information about resources available to homeless youth
  • challenges and risks of compiling information, comparing, and rating services
  • effective ways to moderate online groups to ensure safety and meaningful support

Beyond Services and Resources to Creating Change

While access to resources and services matters, so does the deeper work of systemic change. The table discussed ways that online connections and community could advance that kind of change.

One participant reflected: “For me, organizing is the most impactful way I make a difference. How do we represent ourselves on social media? There is a preconception of how people who are homeless express themselves. They are still themselves, whatever they experience. It’s important to think about how to give people the power to show up however they want to show up.”

Restructuring How Researchers and Participants Relate

The group also discussed ways that experts with lived experience of homelessness are sometimes invited to contribute to research without a way for their contributions to be acknowledged. Community partners invest their knowledge and time in university research projects. Those projects advance the education and careers of university-based team members, yet they often fail to similarly advance the careers of partners outside the university.

community experts are sometimes invited to contribute to research without a way for their contributions to be acknowledged

At the summit, we discussed several ways to address these problems, including:

  • producing certificates for community partners that they could include in their resume (CAT Lab started prototyping this idea with Wikipedia participants in 2019)
  • including community partners as co-authors (CAT Lab has several studies underway with community partners as expected co-authors)
  • writing letters of recommendation for community partners (this was a great idea that we hadn’t considered before)
  • integrating with projects like Open Badges for recording and certifying informal learning experiences

Acknowledgments

We’re grateful to everyone who participated, to Alyssa Watson, who handled event logistics at Cornell, and to the team at Social Media Test Drive, who helped CAT Lab navigate Cornell administration in our first semester. The summit was funded by the MacArthur Foundation and hosted by the Ford Foundation. Thanks, everyone!