How can we address online harms such as revenge porn? And how can lessons and procedures from restorative justice apply to online environments?

Speaking today at CAT Lab is Dr. Niloufar Salehi, an Assistant Professor at the School of Information at UC Berkeley. Her research interests are in social computing, participatory and critical design, human-centered AI, and more broadly, human-computer interaction (HCI). Her work has been published and received awards in premier HCI venues, including ACM CHI and CSCW. Through building computational social systems in collaboration with existing communities, controlled experiments, and ethnographic fieldwork, her research contributes to the design of alternative social configurations online.

How might Twitter work differently? To imagine “Good Twitter,” Niloufar and her collaborators Amy Hasinoff and Anna Gibson are using the practice of speculative design to imagine alternative approaches to managing online harm (you can read more in this Brookings Institution article).

Niloufar tells us about a private Facebook group called Marines United, where people shared dossiers of women’s intimate imagery without their consent. The harm caused by this group stood out because it was a coordinated, large-scale effort, but it’s not an outlier. According to research by Pew, one in five Americans has been the target of severe forms of harassment such as sexual harassment, stalking, revenge porn, physical threats, and sustained harassment over time.

When we start talking about online harm, we often end up talking about free speech and debating gray areas. In this talk, Niloufar is not engaging in that conversation. Revenge porn, says Niloufar, is an area where everyone agrees there is harm, even if we don’t yet know what to do about it.

After the journalistic investigation into Marines United, Facebook shut down the group. But other groups quickly sprang up to replace it. Because these are private groups, people within them won’t report the content, which allows perpetrators to fly under the radar of moderation systems. In response to advocates, Facebook promised to take an active role in addressing these harms.

A new section of Facebook’s website called “Not Without My Consent” represents the most proactive approach Facebook and others have taken. The service allows individuals to report specific images that have been shared non-consensually, and offers tips on engaging with law enforcement. Niloufar argues that despite the time and money invested in this effort, the approach will ultimately not be enough.

Here’s how Facebook’s system works: people who are worried that their intimate imagery might be shared go to the Facebook website and upload the photo to Facebook, which will then prevent the image from being shared. This system has several problems. It assumes that:

  • people have the photo
  • they trust Facebook with extremely sensitive content
  • the software can’t be fooled

Most importantly, this software takes control away from the victim. They never learn if someone tried to upload it, and they aren’t offered any proof that they could take to law enforcement. And Facebook, like other companies, tends to deny requests from lawyers for evidence that could help victims in court.
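For readers curious about the mechanics: the pilot reportedly works by storing a hash (a digital fingerprint) of the reported image rather than the image itself, and rejecting future uploads that match. Here is a minimal sketch of that idea, using a cryptographic hash as a stand-in for the perceptual hashes real systems use; the final line illustrates the “software can’t be fooled” assumption, since even trivially altering an image defeats an exact-match hash.

```python
import hashlib

# Hashes of reported images; the images themselves need not be stored.
blocklist: set[str] = set()

def report_image(image_bytes: bytes) -> None:
    """A victim pre-emptively reports an image; only its hash is kept."""
    blocklist.add(hashlib.sha256(image_bytes).hexdigest())

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload whose hash matches a reported image."""
    return hashlib.sha256(image_bytes).hexdigest() not in blocklist

report_image(b"sensitive-photo")
print(allow_upload(b"sensitive-photo"))               # False: exact copy is blocked
print(allow_upload(b"sensitive-photo, re-encoded"))   # True: any change evades an exact-match hash
```

Production systems use perceptual hashes (such as Microsoft’s PhotoDNA) that tolerate resizing and re-encoding, but the core trade-off is the same: the match happens silently on the platform’s side, and the victim receives no notification or evidence.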

Niloufar says that companies fundamentally misunderstand revenge porn by imagining it as a problem of individual pieces of bad content. They think about it as something similar to spam or other content that is handled through content moderation, rather than as a problem of social context and relationships between people. It also echoes the common framing that victims of revenge porn did something wrong. 

Niloufar suggests restorative justice as an alternative way to frame the problem. Restorative justice has origins in indigenous practices and has been taken up by activists and abolitionists as a way to think about redressing harms. Its central obligation is to right wrongs by addressing victims’ needs and holding offenders accountable, in contrast to other approaches that don’t center victims and survivors.

What does restorative justice look like? In participatory sessions, Niloufar and her collaborators share scenarios with participants who have experienced revenge porn and who have expertise in restorative justice processes. Together, participants talk about what they might do in these situations.

Many of the interventions discussed are quite labor intensive and expensive, says Niloufar. Restorative justice looks very different from corporate visions of content moderation as scalable work involving underpaid labor.

How are we going to get to that future? We’re going to need to push companies and organize alternatives, Niloufar tells us. Companies often benefit from the problems of harassment and content moderation due to the increased engagement they cause. Niloufar concludes with a provocation: if companies can’t effectively address these issues, then maybe they shouldn’t exist or have such high profit margins.