The recent takeover of Twitter by Elon Musk has left many Twitter users looking for a new online home. In the wake of ethical concerns and growing uncertainty about the stability of Twitter, many in the tech industry, academia, journalism, and beyond are turning to Mastodon to provide that home. In addition to creating accounts, many are also starting, or hoping to start, Mastodon servers to build communities of like-minded others. While starting a Mastodon server may be appealing, there are barriers to starting servers that could shape who creates and participates in these spaces, as well as challenges to operating them. Unless it's done right, issues common on large social media platforms, such as racism, harassment and abuse, and mis- and disinformation, will be replicated on Mastodon, even if server hosts want the exact opposite. By asking "what barriers exist to creating healthier online spaces on Mastodon?" we can begin to imagine remedies that support the development of safe, secure, and inclusive experiences on Mastodon.

Technical barriers

The most common way to start an online community is to work within an existing platform. Starting a Facebook group, subreddit, Discord server, or Slack workspace can often be done with the click of (several) buttons. The platform then supports and maintains the infrastructure. This is not the case with Mastodon.

  • Technical capacity: Starting a server requires technical expertise. While Mastodon provides an "ingredient list" and instructions, these may feel overwhelming to people without networking, engineering, or programming skills. While Mastodon's federated model supports highly individualized communities, the high technical wall may prevent people historically underrepresented in computing from creating servers that meet their needs. Services have popped up to create and support Mastodon servers, but those are more expensive than independently run setups and may not be reliable in the long run. (Whether a server is hosted or self-run, its basic public configuration can be inspected programmatically; see the sketch after this list.)
  • Computing and financial resources: Running a server also requires computing and financial resources, both in start-up costs and in maintenance. Further, while start-up may be a one-time cost, the cost of maintenance is difficult to predict, as it will vary depending on how many people join the server and how active they are. Groups that more frequently experience poverty (often due to racist, transphobic, or patriarchal systems) may be less able to afford creating and maintaining a server, particularly an active one, and may have to rely on donations from users (who may themselves experience poverty).
  • Privacy and security: Administrators are responsible for ensuring the security of their users' data, particularly as end-to-end encryption is not built into the system. Further, users' privacy assumptions may not match reality (e.g., an expectation that DMs are encrypted and unreadable by administrators, when they are not). Issues with computing resources, scaling, and/or bad actors could make commonly avoided security issues, such as DDoS attacks, more common and more impactful.
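
For those weighing managed hosting against self-hosting, or checking on a server they have just launched, a server's basic public configuration can be read from Mastodon's documented, unauthenticated instance endpoint. Below is a minimal sketch in Python using the requests library; exactly which fields are returned varies across Mastodon versions, and mastodon.social is only an example domain.

```python
import requests

def inspect_instance(domain: str) -> None:
    """Print a Mastodon server's public self-description.

    GET /api/v1/instance is public and unauthenticated; which fields
    appear varies by Mastodon version, so everything is read defensively.
    """
    resp = requests.get(f"https://{domain}/api/v1/instance", timeout=10)
    resp.raise_for_status()
    info = resp.json()

    print("Title:             ", info.get("title"))
    print("Version:           ", info.get("version"))
    print("Registrations open:", info.get("registrations"))
    print("Approval required: ", info.get("approval_required"))
    stats = info.get("stats", {})
    print("Users:             ", stats.get("user_count"))
    print("Statuses:          ", stats.get("status_count"))

if __name__ == "__main__":
    inspect_instance("mastodon.social")  # example domain; substitute your own
```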

Barriers to maintaining a server

The prevailing paradigm for online social interactions is the platform, and many people migrating to Mastodon (including those creating servers) continue to operate under the platform paradigm. However, Mastodon is free and open-source software (FOSS), not a platform; in essence, each server is its own platform, which complicates how servers are operated.

  • Compliance: While some moderation-related regulations are unlikely to affect server administrators (for example, Germany's NetzDG only applies to platforms with more than 2 million users), others, such as the GDPR, the DMCA, and 18 U.S.C. § 2258A (which requires platform operators to remove and report CSAM), may. As with larger platforms, Mastodon server administrators will need to develop ways to respond to and report illegal content on their servers (see, for example, this guide to responding to DMCA takedowns and this discussion of Mastodon and GDPR compliance).
  • Responsibility: Fines or lawsuits levied against the server for compliance violations or security breaches may place individuals, rather than corporations, at risk, which may discourage those without the resources to afford legal counsel, or those who have adversarial relationships with law enforcement, from creating servers.
  • Local policy development and enforcement: Effective content moderation is a key element of the success of all platforms and communities, large and small. Developing and enforcing effective rules requires expertise that many Mastodon administrators may not have, and enforcement, particularly at scale, requires substantial time, labour, and tooling. (Mastodon does ship a basic report queue; see the sketch after this list.)
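
As a rough illustration of what day-to-day enforcement tooling looks like, Mastodon's built-in report queue is reachable through its admin API. The sketch below lists unresolved reports; it assumes a reasonably recent Mastodon server, an access token with the admin:read:reports scope, and placeholder values for the server URL and token, all of which you would replace with your own.

```python
import requests

API_BASE = "https://example.server"        # placeholder; use your server's URL
TOKEN = "REPLACE_WITH_ADMIN_ACCESS_TOKEN"  # token needs the admin:read:reports scope

def open_reports() -> list:
    """Fetch unresolved moderation reports from Mastodon's admin API."""
    resp = requests.get(
        f"{API_BASE}/api/v1/admin/reports",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"resolved": "false"},  # only reports still awaiting action
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for report in open_reports():
        target = report.get("target_account", {}).get("account", {}).get("acct", "?")
        comment = (report.get("comment") or "")[:80]
        print(f"Report {report['id']} against @{target}: {comment}")
```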

Barriers to inclusion

Each of the barriers listed above can exacerbate exclusion, particularly among groups that have been historically marginalized in tech and online spaces. In addition to these, there are several other barriers to inclusion on Mastodon. Many of these have been identified by Black Mastodon users and are likely to impact other groups as well.

  • Missing affordances: Twitter threads by Dr. Jonathan Flowers and Dr. Chanda Prescod-Weinstein have highlighted the importance of several affordances of Twitter that are unavailable on Mastodon. Two examples are the lack of quote-tweeting and insufficient protection through blocking. As Dr. Flowers notes, quote-tweeting supports call-and-response engagement, in which racism can be called out and community built through responses. While Mastodon also supports blocking, blocking only prevents the person who set the block from seeing content from the blocked person. The content, which in Dr. Prescod-Weinstein's case contained anti-Black racism and antisemitism, remains visible to others.
  • Prevalence of white technoculture: Although the mass migration to Mastodon is recent, Mastodon was released in 2016 and has had a small yet active user base with developed norms. Each instance is different, and Mastodon itself has been shaped by queer activists since the early days of the site. However, users have reported experiencing microaggressions, ableism, and mansplaining from seemingly well-meaning users, in addition to harassment and abuse: issues also common in other spaces predominantly used by white, techy men. In some cases, established norms, such as the regular use of content warnings that automatically collapse sensitive content, can provide a more inclusive environment, for example for those who have experienced trauma. However, how and when they are used, and when they are called for by others, may also cause harm. For example, Mastodon user Mekka describes how asking for content warnings on posts describing racism obfuscates discussions of racism behind a filter, preventing the white majority on Mastodon from having to confront it.
  • Leveraging blocks to silence users: Server administrators have the ability to block entire servers, a control not dissimilar to those of platforms such as Reddit, Facebook, and Discord, which regularly ban communities. However, on Mastodon, a server-level block prevents the blocking server's users from seeing or accessing content from users on the blocked server (unlike Reddit, where a participant in a banned subreddit can still participate in other subreddits). This is a powerful control for blocking content from servers that operate to perpetuate hate. However, it may also be used to silence individuals speaking truth to power. Further, which servers a server has blocked may not be visible to its users, and users may not know that they are on a server that has been blocked, thus prohibiting them from building community across the fediverse (an ability that has been important for building connections on Black, trans, and disability Twitter). Whether a blocklist is published is configurable, and it can be checked programmatically (see the sketch after this list).
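
As a concrete illustration of that visibility gap: since Mastodon 4.0, each administrator chooses whether the server's domain blocklist is shown to nobody, to logged-in users, or to everyone, and only in the last case does the public endpoint below return it. This sketch simply reports what, if anything, a given server publishes; the domain passed in is only an example.

```python
import requests

def visible_domain_blocks(domain: str) -> None:
    """Report whether a server publishes its domain blocklist.

    GET /api/v1/instance/domain_blocks (Mastodon 4.0+) returns the list
    only when the admin has made it public; otherwise the server answers
    with an error status (e.g., 404, or 401 when limited to logged-in users).
    """
    resp = requests.get(f"https://{domain}/api/v1/instance/domain_blocks", timeout=10)
    if resp.status_code != 200:
        print(f"{domain} does not publish its blocklist (HTTP {resp.status_code})")
        return
    for block in resp.json():
        reason = block.get("comment") or "no public comment"
        print(f"{block.get('domain')} [{block.get('severity')}]: {reason}")

if __name__ == "__main__":
    visible_domain_blocks("mastodon.social")  # example domain; substitute any server
```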

Creating welcoming spaces on Mastodon

Surmounting these barriers will take financial support, knowledge, and learning. Fortunately, there are people whose knowledge can be tapped, such as experienced administrators of successful servers, policy experts, and community moderators, as well as tactics, such as controlling growth, that can help.

People facing technical and maintenance barriers may benefit from learning from experienced instance hosts who have the technical expertise to create and maintain servers and who already work with existing moderation tools. Some administrators have created public guides to help people making the move from Twitter to Mastodon understand how Mastodon works and how its culture differs, such as this one by Mastodon user Nikodemus (thanks to jessamyn@glammr.us for sharing it!). They may also benefit from the knowledge held by legal and policy experts, who could provide broad guidance on compliance procedures.

People facing inclusion barriers can learn from experienced moderators across platforms, including long-time instance hosts (e.g., social.coop). For example, community moderators have expertise in developing rules, navigating gray areas, and scaffolding successful community engagement. They also have experience working within hostile environments to create welcoming spaces. While server hosts have limited control over Mastodon's affordances, community moderators also have experience overcoming technological design that works against their communities. For example, moderators of AAPI subreddits engage in decolonization practices, such as network building, in response to threats such as brigading; r/AskHistorians moderators use active moderation to subvert Reddit's upvoting system to support knowledge sharing and learning; and Black moderators have built and maintained successful communities for Black Redditors for years on a platform that did not have a policy prohibiting hate speech until 2020. However, it should be noted that this comes at considerable effort and risk: moderators in each of these contexts reported experiencing pushback, harassment, and abuse in response to challenging the prevailing design and culture.

Finally, it's okay to grow slowly. Developing policies and practices that work for each community, learning how to effectively enforce policies, and determining what kind of labour enforcement requires all take time, particularly as server hosts also need to manage technical capacity. Waitlists, like those used by Project Mushroom, can help control growth so that it happens at a manageable pace. Mastodon's approval-required registration mode supports this kind of gatekeeping directly, as sketched below.
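
With approval-required registration enabled, new sign-ups queue as pending accounts until an administrator admits them, which functions as a built-in waitlist. The sketch below approves a small batch at a time; it assumes the v1 admin accounts API (including its pending filter), an access token with admin account scopes, and placeholder values for the server URL and token.

```python
import requests

API_BASE = "https://example.server"        # placeholder; use your server's URL
TOKEN = "REPLACE_WITH_ADMIN_ACCESS_TOKEN"  # token needs admin account scopes
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def approve_pending_batch(limit: int = 5) -> None:
    """Admit a small batch of pending sign-ups, waitlist-style.

    GET /api/v1/admin/accounts?pending=true lists accounts awaiting
    approval; POST /api/v1/admin/accounts/:id/approve admits each one.
    Keeping `limit` small lets the server grow at a manageable pace.
    """
    resp = requests.get(
        f"{API_BASE}/api/v1/admin/accounts",
        headers=HEADERS,
        params={"pending": "true", "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    for account in resp.json():
        requests.post(
            f"{API_BASE}/api/v1/admin/accounts/{account['id']}/approve",
            headers=HEADERS,
            timeout=10,
        ).raise_for_status()
        print(f"approved @{account['username']}")

if __name__ == "__main__":
    approve_pending_batch(limit=5)
```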

Mastodon is in its first wave of Eternal September, in which new users overwhelm existing norms and culture. As Twitter becomes less safe to use due to security and content moderation issues, Mastodon may experience further waves of growth. Managing those waves is possible, even when there are significant barriers, if people are willing to provide the requisite support and are open to learning.