Nathan Gardels is the editor-in-chief of Noema Magazine.
“In a murmuration,” Renée DiResta writes in Noema, “each bird sees, on average, the seven birds nearest it and adjusts its own behavior in response. If its nearest neighbors move left, the bird usually moves left. If they move right, the bird usually moves right. The bird does not know the flock’s ultimate destination and can make no radical change to the whole. But each of these birds’ small alterations, when occurring in rapid sequence, shift the course of the whole.”
What more pertinent way is there to describe the viral dynamics of Twitter than to compare it to the collective behavior of the birds iconified in its corporate logo? What we can learn from the self-organizing patterns of feathered flocks may help us discover how peer-to-peer social networks might be governed — both politically and through algorithmic design — to amplify their potential as a common space of communication while mitigating the deleterious effects that have so damaged the quality of democratic discourse.
The philosopher Byung-Chul Han has posed the challenge to overcome elsewhere in Noema: “Digital communication redirects the flows of communication. Information is spread without forming a public sphere. It is produced in private spaces and distributed to private spaces. The web does not create a public.”
Using the murmuration metaphor, DiResta explains the negative kinetics that fragment the possibility of framing a common reality: “For humans, signals are passed from screen to screen, news feed to news feed, along an artificial superstructure designed by humans but increasingly mediated by at-times-unpredictable algorithms. It is curation algorithms, for example, that choose what content or users appear in your feed; the algorithm determines the seven birds, and you react.”
She continues: “Twitter’s Trending Topics, for example, will show a nascent ‘trend’ to someone inclined to be interested, sometimes even if the purported trend is, at the time, more of a trickle — fewer than, say, 2,000 tweets. But that act, pushing something into the user’s field of view, has consequences: the Trending Topics feature not only surfaces trends, it shapes them. The provocation goes out to a small subset of people inclined to participate. The user who receives the nudge clicks in, perhaps posts their own take — increasing the post count, signaling to the algorithm that the bait was taken and raising the topic’s profile for their followers. Their post is now curated into their friends’ feeds; they are one of the seven birds their followers see.”
The end result is often all too familiar: “Recurring frenzies take shape among particular flocks, driving the participants mad with rage even as very few people outside of the community have any idea that anything has happened.”
DiResta explores whether “content moderation” aimed at separating the “good” from the “bad” can be timely and effective given the distributed character of the information ecosystem, in which one fleeting tweet can escape the cage and spiral off into viral murmuration among millions of the similarly disposed. Musk’s own tweet to his 114 million followers about Nancy Pelosi’s husband, deleted only after the damage was done, comes to mind as one particularly poignant example.
Moreover, as the Twitter custodian’s libertarian belief in absolute free speech makes clear, determining how misinformation or disinformation is defined, and by whom, is itself an endlessly polarized debate.
DiResta sees more promise in “rethinking design” that has “the potential to shape propagation through curation, nudges or friction.”
Her specific proposals: “Twitter might choose to eliminate its Trending feature entirely, or in certain geographies during sensitive moments like elections — it might, at a minimum, limit nudges to surfacing actual large-scale or regional trends, not simply small-scale ragebait. Instagram might enact a maximum follower count. Facebook might introduce more friction into its Groups, allowing only a certain number of users to join a specific Group within a given timeframe. These are substance-agnostic and not reactive.”
DiResta acknowledges that such an approach faces the howling headwinds of the social media business model, which favors virality over all else.
A Digital Republic
Jamie Susskind follows the logic of this market dynamic to its culmination in the concentrated ownership of the attention economy and proposes how it might be addressed. “The unaccountable power of digital technology is at its most obvious when a vast social media platform is purchased by one man for expressly political purposes. But the challenge is not limited to Musk or even to social media. Something bigger is going on,” he wrote in Noema earlier this year. “The political implications are clear to anyone who wants to see them: those who own and control the most powerful digital technologies will increasingly write the rules of society itself.” If “the digital is political,” it too needs the kinds of institutional constraints that govern the political systems of open societies. In short, a “digital republic.”
In a digital republic, Susskind argues, “There would be appropriate checks and balances on the exercise of digital power. These might take familiar forms: systems of certification for powerful technologies; professional qualifications and duties for powerful individuals; avenues of appeal against important algorithmic determinations; systems of inspection and oversight for high-risk products and platforms. In other industries, these kinds of measures are commonplace. In tech, they are seen as heretical.”
Putting Community Back Into Communication
Together, what DiResta and Susskind propose at least points the way toward some correctives to a system of communication so far driven by the imperative to capture the attention of the largest market of autonomous like-minded individuals and connect them into tribal silos that confirm their exclusive worldview. As the philosopher Han puts it, digital networks designed in this way amount to “communication without community.”
Democracy in its republican form is fundamentally about how diverse citizens within one community relate to each other through the institutions of self-government. And where there are interpersonal relations, there are ethics of communication by which the trustworthiness and truth content of information constitute the narrative ground upon which civic coexistence is possible.
Like political revolutions, technological revolutions tend to unfold in phases. First comes the liberating breakthrough from the old order, burnished with utopian ideals. Next comes the reaction to abuses that inevitably arise from embarking on a new path for which there are no rules, especially for the first movers who become the new masters. Finally, a new governing order is established that sorts out the mistakes and excesses from the benefits of transformational change and eliminates or tempers the former.
This last phase, it seems — let’s hope — is where we are headed next.