Mastodon Isn’t Just A Replacement For Twitter

Users flocking to the platform will need to shift their expectations for social media and become engaged democratic citizens in the life of their networks.


Nathan Schneider is an assistant professor of media studies at the University of Colorado Boulder.

Amy Hasinoff is an associate professor in the Department of Communication at the University of Colorado Denver.

They recently published “From Scalability to Subsidiarity in Addressing Online Harm” in the journal Social Media + Society.

During Elon Musk’s chaotic first weeks in charge of Twitter, some people have been fleeing to Mastodon in search of a better place to lurk and post. This open-source, Twitter-like platform is part of a system of noncorporate social media known as the fediverse, along with software like the video-sharing platform PeerTube and the file-sharing platform Nextcloud. Over at Social.coop, the cooperatively governed Mastodon community that Nathan co-founded in 2017, new user registrations have swelled from a trickle to a torrent.

The age of Big Social may be ending, as advertisers shift to platforms like TikTok and streaming video that are more like entertainment channels. For many reasons, we say: good riddance. The damage commercial social media has done to politics, relationships and the fabric of society needs undoing. As media scholar Victor Pickard suggests, “Hopefully Twitter’s collapse will lead to a more expansive conversation about the relationships between capitalist imperatives and the communication [and] information needs of democratic societies.”

As users begin migrating to the noncommercial fediverse, they need to reconsider their expectations for social media — and bring them in line with what we expect from other arenas of social life. We need to learn how to become more like engaged democratic citizens in the life of our networks.

Among dominant social networks, the guiding approach to governance has been what the anthropologist Anna Lowenhaupt Tsing calls “scalability.” This doesn’t just mean large scale. It means, according to Tsing, “the ability to expand — and expand and expand — without rethinking basic elements.” It means exponential growth while retaining a one-size-fits-all approach to dealing with problems, and it’s what venture capitalists look for in their investments.

Scalability explains a lot of what seems wrong with social media. Content moderation at scale needs to be semi-automated, which often means applying universal rules without context or nuance. And when abuse, harassment and misinformation drive engagement, the incentive is to address them in ways that don’t threaten the business. Lacking local knowledge of their users’ languages and cultures, platform companies have aided political interference and even genocide. All these problems have led Meta’s Mark Zuckerberg and former Twitter CEO Jack Dorsey to try to outsource moderation decisions, creating independent organizations for “oversight” and “decentralization.” As Musk’s acquisition of Twitter loomed, Dorsey tweeted about the platform he co-founded: “The biggest issue and my biggest regret is that it became a company.” Even he could see that the business that helped make him wealthy had taken on too much.

“The damage commercial social media has done to politics, relationships and the fabric of society needs undoing.”

As users migrate to the fediverse, they often bring the old expectations of scalable social media. We are used to someone else being in charge and taking care of the problems that might arise, and we are used to complaining when they mess up.

Users also bring to Mastodon the same racism, sexism and bad behavior that arise in other kinds of online spaces. But in the fediverse, we cannot simply rely on a company’s trust and safety department to take care of problems for us. The challenge and the opportunity of spaces like the fediverse is that it is up to us which rules we want to follow and how we make rules for ourselves.

Commercial social media give community moderators in spaces like Facebook Groups or subreddits some tools to address problems, and many of those moderators try to involve their community members in decision-making. Ultimately, however, these commercial platforms limit the scope of community self-governance to suit their interests. The fediverse opens new doors. It allows us the possibility to collectively own and more fully self-govern the online communities we participate in.

But how can genuine community self-governance work at the scale of a global social network? We believe that it is time to embrace the old idea of subsidiarity, which dates back to early Calvinist theology and Catholic social teaching. The European Union’s founding documents use the term, too. It means that in a large and interconnected system, people in a local community should have the power to address their own problems. Some decisions are made at higher levels, but only when necessary. Subsidiarity is about striking the right balance between local units and the larger systems they belong to.

Subsidiarity may or may not be a familiar word, but the practice of it is everywhere. In any given town, we take it for granted that its libraries, schools and roads should be largely under local control but still must meet state or federal standards. The common-law court system empowers local officials to interpret laws according to local norms, except when they contradict decisions at a higher level. In contrast to the way the rest of our daily lives are organized, the scalability of social media governance is actually very unusual.

The fediverse is designed for subsidiarity. People cannot simply “join Mastodon.” Instead, users join a particular server that runs Mastodon software, and that server can be moderated and operated in a wide variety of ways. Rather than Twitter’s one-size-fits-all public square, Mastodon is designed for smaller communities that interact with each other. One server might host hundreds or thousands of accounts, while another might serve only a single user. Just as a user with an email address at gmail.com can easily communicate with someone who uses protonmail.com, users on different servers can talk to each other and appear on one another’s feeds.
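Under the hood, this email-like addressing works through the WebFinger discovery protocol, which Mastodon and other fediverse software use to look up accounts across servers. Here is a minimal sketch in Python of how any server (or curious user) might resolve a handle hosted elsewhere; the handle itself is hypothetical:

```python
# A minimal sketch of how fediverse servers find each other's users.
# Mastodon uses the WebFinger protocol (RFC 7033): given an address like
# @user@example.social, any server can ask example.social where that
# account lives. The handle used below is a placeholder.

import json
import urllib.parse
import urllib.request

def resolve_handle(handle: str) -> dict:
    """Look up a fediverse handle (e.g. 'user@example.social') via WebFinger."""
    user, domain = handle.lstrip("@").split("@", 1)
    query = urllib.parse.urlencode({"resource": f"acct:{user}@{domain}"})
    url = f"https://{domain}/.well-known/webfinger?{query}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# The response lists the account's ActivityPub actor URL, which other
# servers use to deliver posts, follows and replies across the network.
# info = resolve_handle("user@example.social")  # hypothetical handle
# print([link["href"] for link in info["links"] if link.get("rel") == "self"])
```

Each server keeps its own community, but this shared lookup layer is what lets those communities interoperate.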

“The challenge and the opportunity of spaces like the fediverse is that it is up to us which rules we want to follow and how we make rules for ourselves.”

The fediverse allows users on each server to create their own codes of conduct and other rules. Social.coop, for instance, is a Mastodon server with a robust code of conduct meant to foster a healthy space for people most likely to experience marginalization elsewhere.

On Social.coop, we don’t just post and comment about what’s on our minds; we also decide on our moderation practices and enact them through committees. The Community Working Group handles conflict resolution through accountability processes. Its members are paid with funds from our sliding-scale member dues. The Tech Working Group maintains our servers, while the Finance Working Group keeps an eye on our budget. Any member can propose new activities and policies, and we can all vote on them according to the bylaws. We adjust Mastodon’s moderation settings as we see fit.

Anyone, with any set of values, can do the same with Mastodon software. Donald Trump’s Truth Social and the white supremacist-harboring network Gab both use some of the same underlying software. Just like on the rest of the internet, anyone, from violent extremists to people with uncommon hobbies, can use the available tools to create siloed spaces. The difference with the fediverse is that it facilitates a structure of relationships between communities.

The subsidiarity of the fediverse enables each server to restrict connections with other servers. This means that moderation on the fediverse occurs not through top-down decisions and universal rules but through bottom-up coordination among the people maintaining servers. For instance, a number of servers organized to collectively cut off servers that harbored white supremacists, such as Gab, from the rest of the fediverse — even though Gab remained active on the network, most people using Mastodon would never see its users’ posts. At a time of widespread distrust of big tech companies and their moderation decisions, these kinds of democratic, collective moderation decisions could help strike a balance between free speech and safety.
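In Mastodon, this bottom-up coordination takes the concrete form of server-level domain blocks. As a rough illustration, assuming a server recent enough to expose Mastodon’s admin domain-blocks API, an administrator could sever federation with a harmful server along these lines; the server name, access token and blocked domain below are all placeholders:

```python
# A sketch of server-level defederation, assuming a Mastodon server
# recent enough to expose the admin domain-blocks endpoint
# (POST /api/v1/admin/domain_blocks). All names here are placeholders.

import json
import urllib.parse
import urllib.request

OUR_SERVER = "https://example.social"  # hypothetical home server
ADMIN_TOKEN = "..."                    # an admin-scoped access token (not shown)

def block_domain(domain: str, severity: str = "suspend") -> dict:
    """Ask our server to stop federating with another server.

    'silence' hides the domain's posts from shared timelines;
    'suspend' severs the connection entirely.
    """
    data = urllib.parse.urlencode({"domain": domain, "severity": severity}).encode()
    request = urllib.request.Request(
        f"{OUR_SERVER}/api/v1/admin/domain_blocks",
        data=data,
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# block_domain("extremist-haven.example")  # one server's bottom-up choice
```

No central authority issues these blocks; each community decides for itself, and shared blocklists emerge when many communities converge on the same decision.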

But the fediverse is not a utopia — it’s just software. Though it facilitates community self-governance, it does not guarantee it. Most of the people entering the fediverse right now are flocking to a small number of popular servers. In effect, they are repeating the logic of scalability, except this time without a company in charge able to spend millions of dollars on large-scale moderation. Currently, many servers appear to be run top-down by people who have the technical skills to set them up but who do not necessarily have the social and economic capacity to foster and sustain community self-governance and address online harm.

Dealing with harm and conflict effectively within our own online communities will require a shift in mindset. We are used to outsourcing problems to large platform companies or, when violence happens offline, to police and criminal legal systems.

“In a large and interconnected system, people in a local community should have the power to address their own problems.”

Instead, we can build community-based systems for addressing harm. In our research, we have joined a growing number of scholars and journalists in turning to restorative and transformative justice movements for guidance. These movements try to approach problems through processes of community accountability that help people understand, prevent and repair harm. Many of the people leading these movements are women, non-binary and trans people of color who have acutely experienced the failures of top-down systems of policing and incarceration. These activists have built organizations and produced training manuals that could help guide us in replacing the scalable approach to content moderation with something more local, grassroots and participatory.

When it comes to online harassment, the main goal of scalable content moderation is to identify content that violates the rules and remove or demote it, sometimes also banning the user who posted it. In contrast, a community accountability process would focus on identifying and meeting the needs of the person who was harassed.

Members of the community would support that person and, if needed, work together to clarify and reaffirm the values that were violated in the incident. If those who caused harm are willing to participate, the goal would be to help them understand and take accountability for it. Many studies of restorative justice offline demonstrate that this can produce greater healing than simply punishing the person who caused harm.

What makes processes like this work is a sense of shared values and buy-in among community members. At Social.coop, we intentionally began by creating our server as a “commons.” This means that no one person owns our community; we started as a fiscally sponsored, collectively funded project through Open Collective. Because Mastodon is designed more for chatter than governance, we use a separate platform, Loomio, for our deliberation and decision-making. No single person has the power to pull the plug or to change our rules. This also means that we all have the responsibility to contribute financially and to participate in our self-governance. No investors or advertisers are taking care of our bills for us.

For this kind of community-governed social media to spread, it needs to become much easier than it was for us. Social.coop was founded by veterans of cooperative business who were willing to do things the hard way in order to do them right. Mass adoption of online democracy based on subsidiarity cannot depend on that kind of dedication.

For instance, Mastodon itself might someday include more tools for self-governance and decision-making, so communities don’t have to go somewhere else to find those things. Perhaps services like managed server hosting (so that tech skills aren’t needed) and fiscal sponsorship (so that no one person owns the community) could be packaged along with templates for bylaws and codes of conduct. It should be as easy to create a community in the fediverse as it is to start one on Discord, Slack or Facebook — but that is far from the case now. Participating in community self-governance in our online spaces should become as routine and familiar as upvoting a friend’s post, voting for a city council representative and perhaps even occasionally showing up for jury duty.

“Dealing with harm and conflict effectively within our own online communities will require a shift in mindset.”

If we really want an alternative to Big Social, we need to find ways to invest in it. Twitter has thousands of employees and was recently purchased for $44 billion, while Mastodon has largely been the work of one developer and a community of volunteers. Public investment, philanthropy and forms of private investment in free, open-source software could help platforms like Mastodon counter the monopolistic incentives of venture capital.

If the venture capital model were unleashed on the fediverse, the democratic potential of software like Mastodon would likely be lost. To prevent that, funds could be established to provide loans to help new fediverse communities get going, which they could then pay back with member dues. As new communities develop, they should also make contributions to the development of Mastodon or whatever software they use, perhaps with a platform like Open Collective.

As the fediverse experiences growing pains, we must not judge its progress by the standards of much better-financed technologies. The response to its problems should not be to fall back on models of scalability through consolidation and corporate power. If the fediverse starts feeling chaotic, people might be tempted to flee to a new monopoly that promises to fix it for them. Instead, we should address problems by supporting community governance and subsidiarity.

The fediverse is a different kind of social network, and users need to organize differently on it. As a recent Electronic Frontier Foundation post explains, “Open, decentralized systems offer new choices towards a better online world, but it’s up to us to make those choices.”

Learning how to self-govern on social media will take time. It will present new kinds of challenges and new crises. But when the alternative is top-down control by unaccountable corporations and capricious billionaires, we have to try.