Nathan Gardels is the editor-in-chief of Noema Magazine.
The same medium that so effectively transmits a howling message of change also appears to undermine the ability to make it. Social media amplifies the human tendency to bond with one’s own kind. It tends to reduce complex social challenges to mobilizing slogans that reverberate in echo chambers of the like-minded rather than engage in persuasion, dialogue and the reach for consensus. Hate speech and untruths appear alongside good intentions and truths. We’ve seen this both in the Trump campaign in the United States and in the Brexit campaign in Great Britain.
When the body politic is divided against itself, each “tribe” hewing to its own chosen reality, polarization rigidifies. Paralysis and gridlock set in. Simple answers or authoritarian and strongman alternatives start to look like attractive ways to create order out of chaos.
Wael Ghonim, a social activist whose Facebook posts helped ignite what would become the Arab Spring in Egypt in 2011, has experienced this process firsthand. During the Egyptian revolution, Ghonim said he thought that, “all you need is the internet” to set a society free. It turned out otherwise. That revelation has given the man, once branded the face of his country’s revolution by Western media, pause for further reflection. It prompted him to found Parlio, an online social platform to promote civility and dialogue. He is now pursuing the same aim at Quora.
Ghonim spoke to The WorldPost via email about how to curb the “mobocracy” of social media and make it more of a platform for civil, reasoned reflection that fosters consensus instead of polarization.
At what point in your experience did you realize that the participatory power of social media has a big downside? What event, or series of events, changed your mind?
This wasn’t triggered by a single event. It started to become clearer to me as we were having more serious conflicting views on the way forward for Egypt after the collapse of [former Egyptian President Hosni] Mubarak’s regime. This wasn’t just about “social media” being mainly responsible for the divide ― to elevate it to that role would be as naive as calling the revolution “the social media revolution.”
I started to notice that our leaderless movement was turning into a mobocracy. Those with the loudest voices were defining the roadmap. Instead of having a vision for a future that we agreed on, we were all led into a cul-de-sac of unintended consequences by spontaneous and reactive decision-making.
While social media did not create this underlying dynamic, there is no doubt that the algorithmic structure of social media amplified and abetted the turn to mobocracy. The internet has empowered the masses and introduced a more decentralized medium for communicating with each other. But is this so-called “liquid democracy” without any form of meritocracy that sorts out the wheat from the chaff a good thing for society?
In the current U.S. presidential campaign, [GOP nominee] Donald Trump is a living example of the damage the mobocratic algorithms of social media can do to the democratic process. He effectively used social media to bypass both the political establishment and the mainstream press.
Would he have been able to do this 20 years ago? I highly doubt it. The attention he got was mainly due to the power of social networks vis-à-vis the old political parties and the crisis facing mainstream media outlets, which are desperately chasing views regardless of substance. As Trump put it [in regards to his large Twitter following]:
I love Twitter… it’s like owning your own newspaper without the losses.
— Donald J. Trump (@realDonaldTrump) November 10, 2012
What exactly do you mean by “mobocratic algorithms”?
In social media today, the ultimate prize is getting more eyeballs on posts. For that to happen, the system is designed to reward content that gets the largest number of “likes” and comments.
Why is that a problem? That might work really well when one of us shares a photo of our family, or of a recent adventure, with friends. But when exchanging opinions, the content that draws “likes” or comments is content that either confirms people’s biases or, at the other extreme, elicits highly passionate and emotional reactions against a post, crowding out the less emotive, coolly analytical opinions that would add productively to the conversation.
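The dynamic Ghonim describes can be sketched in a few lines. This is a toy illustration, not any platform’s actual ranking system: posts are ordered purely by reaction counts, so whatever provokes the strongest reaction rises to the top, regardless of whether the reactions signal agreement, outrage or careful analysis. The post titles and numbers are invented for illustration.

```python
# A minimal sketch of an engagement-based feed (hypothetical, not any
# platform's real algorithm): posts are ranked purely by likes + comments.

def engagement_score(post):
    # Every reaction counts equally, whether it reflects agreement,
    # outrage, or cool-headed analysis.
    return post["likes"] + post["comments"]

def rank_feed(posts):
    # Highest-engagement content floats to the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"title": "Nuanced policy analysis", "likes": 40, "comments": 5},
    {"title": "Inflammatory hot take", "likes": 300, "comments": 450},
]

for post in rank_feed(posts):
    print(post["title"], engagement_score(post))
```

Under this scoring, the inflammatory post always outranks the analytical one, even if most of its “engagement” is angry disagreement.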
Can this platform be transformed to enable a refinement of raw emotion, to enlarge the public view through dialogue, negotiation and consensus?
That is the challenge. While once social media was seen as a liberating means to speak truth to power, now the issue is how to speak truth to social media.
As I’ve said, today’s social media currency is based on the numbers of followers, likes and shares. You are rewarded for broadcasting your opinion much more than engaging in conversations. And the more you appeal to those who agree with you, the more social currency you are going to get.
People will be as shallow as platforms allow them to be. Products are like social movements ― they have implicit cultural norms that people follow. As a product manager, I’ve learned through the years that you can design product experiences that engage people in any way you want. If you want to see more dialogue, you simply have to create product experiences that reward such dialogue and make it satisfying to those who engage in it. The critical question, then, is: “What is the incentive for social media platforms to do so?”
What responsibility do the Silicon Valley giants who drive social media have to temper the mobocratic algorithm? Do they have any incentive to do so as long as the business model is based on ads chasing eyeballs?
This is one of the tough questions that, for the short term, I have no answer for. But I believe that we are seeing more and more people talk about these issues. This will eventually make creating more meritocratic algorithms that encourage reasoned dialogue part of the bottom line of social media platforms.
For example, for many years Twitter ignored the trolling and harassment problem, and in the past few years that neglect backfired on them to the extent that it started hurting their business bottom line. So now they are working on figuring out how to curb it.
Our duty is to increase the awareness about these issues in public, and as I said earlier, speak truth to social media power.
We’ve seen this in other industries in the past. Take the oil industry, for example. It now spends billions of dollars to be more socially responsible. The good news is that it won’t take us decades before we see “social media responsibility.” It will evolve much faster.
Isn’t the imperative to monetize content by reaching the most eyeballs already leading to a kind of self-censorship ― “That won’t get so many views, let’s not publish it”?
Absolutely. We are trained to adapt to systems. In the present system, why would someone invest in something that would take more time and get less traction?
As you suggest, this continuous effort to grab attention affects not only users, but producers of content as well. The primary incentive is to create sensational content that more eyeballs will turn toward. What used to be a small, fun article on the 18th page of the Guardian could now be the only featured article that people see in their feed from the news organization or on Facebook.
What would an algorithm that promotes truth-seeking dialogue and civility ― a meritocratic instead of mobocratic algorithm ― look like?
Algorithms could start embedding credibility features within their ranking to give more distribution to content that is vetted as truthful or constructive. One living proof of this possibility is the Google PageRank algorithm, which does take the credibility of a web page into account as one of the many signals for whether a specific page should be featured as the first answer when people search for a particular keyword. One major difference to note here, however, is that harnessing the wisdom of crowds is much easier for Google because the crowd isn’t publicly influenced by others. In social media, people end up seeing what others think before they decide on their reaction to a specific post.
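The credibility signal Ghonim points to can be illustrated with a toy version of PageRank: a page’s rank derives from the ranks of the pages linking to it, a signal independent of public reactions. This is a bare-bones power-iteration sketch over a tiny invented link graph, not Google’s production algorithm.

```python
# Toy PageRank via power iteration (illustrative only; Google's real
# ranking combines many more signals). The link graph is invented.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}        # start with a uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each page passes its rank to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

links = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}
ranks = pagerank(links)
```

The key property: a page gains rank only when other reputable pages link to it, so rank cannot be inflated by a single noisy mob of reactions the way like-counts can.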
Some have argued that the silo and filter bubble effect is simply a response to the avalanche of information available today. It is a way to organize information overload. As social media and the web mature, do you see a resurgence of “curation” as the next response to overload, in effect, the return of trusted mediators and experts instead of the information anarchy of amateurs?
That’s already happening today in many ways. There is one thing I want to highlight though. The fact that we are living in information overload doesn’t mean that only presenting users with content that they are going to “like” or “comment” on is the right way to go.
In machine learning there is what we call “exploit,” which is when you optimize the algorithm to increase the probability of a user expanding, liking or commenting on a post. But there is also “explore” ― when you show content where you have no idea whether the user will like it or not, but are trying to find out. Given the economy of the web today, most platforms focus too much on “exploiting” and too little on “exploring” and exposing us to all that other information out there that we might not otherwise see.
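The exploit/explore tradeoff Ghonim describes is commonly formalized as a bandit problem; a minimal epsilon-greedy sketch makes it concrete. Most of the time the feed exploits (shows the topic with the best observed engagement), but with probability epsilon it explores an arbitrary topic to learn what the user might actually value. The topic names and engagement numbers are invented for illustration.

```python
import random

# Epsilon-greedy sketch of "exploit vs. explore" (a standard bandit
# heuristic, used here only to illustrate the idea, not any platform's
# real feed logic).

def choose_topic(avg_engagement, epsilon=0.1):
    if random.random() < epsilon:
        # Explore: try a topic regardless of past engagement.
        return random.choice(list(avg_engagement))
    # Exploit: show the topic with the highest observed engagement.
    return max(avg_engagement, key=avg_engagement.get)

# Hypothetical observed engagement rates per topic.
avg_engagement = {"politics": 0.9, "science": 0.4, "local news": 0.2}
```

With epsilon at zero the feed only ever exploits ― the user sees “politics” forever; a small positive epsilon is what occasionally surfaces the content they would not otherwise see.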
So, bottom line, you are saying the message can catch up to the medium if we put our minds to it?
Absolutely. If there is will, there is power. We are already seeing the glimmers of it today, and it’s our job to keep pushing the limits and speak the truth to social media platforms.
This interview has been edited for clarity.