By now it should be evident that Twitter Inc. is a shadow of its former self. What’s not clear is which alternatives might replace it, or how the market will look a few years from now. This chaos offers an opportunity to rethink the entire structure of the social media landscape.
There’s no shortage of problems to solve. Disinformation, offensive content, harassment and doxxing are among the issues that platform operators, regulators and users have struggled with. Each existed before MySpace, Facebook, Twitter and Instagram were born, but digitally connecting friends and strangers has exacerbated them to the point that even the largest companies and most powerful governments can’t rein them in.
The answer is simple: Break them up.
Not the social-media companies themselves, as has been suggested for the likes of Facebook owner Meta Platforms Inc., but the networks they run. As any parent knows, the easiest way to stop two children from fighting is to separate them. This is what needs to happen to limit the virality of disinformation and ensure that, say, neo-Nazis don’t clash with socialists.
Since this would eat into their revenue, the companies won’t make it happen on their own. And even if governments could split Meta into its component parts — Facebook, Instagram, WhatsApp — that’s unlikely to limit the flow of problematic content. Recent attempts to stamp out misinformation internally or in conjunction with external organizations have had only limited effect.
“One of the things that’s not been particularly effective is fact-checking our way out of these messes,” Robert W. Gehl, Ontario Research Chair in Digital Governance for Social Justice at York University in Toronto, told me recently. “Because you throw facts out there all day long, but the conspiracies spread really fast. Misinformation spreads fast.”
Elon Musk’s tumultuous time at Twitter (earlier this month the company moved to tweak its algorithm to ensure his posts got more views) highlights the precarious future of the social network, with every new revelation driving fresh interest in substitutes. Mastodon, Post, Koo and Reddit have all gained ground as a result. This is ironic because, except for Post, these platforms have been around for years, and because Twitter — with 200 million users — is a minnow compared with giants like Facebook, Instagram and TikTok, which have more than 1 billion users each.
That the slow disintegration of a relatively small platform has spurred millions to look elsewhere indicates a nascent appetite for something different, and presents an opportunity for these lesser-known social networks to gain momentum. Their biggest challenge, though, is facing off against a well-established incumbent and overcoming the network effect that favors large players over smaller ones.
Post and Koo seem to be replicating the Twitter model — a centralized system viewable and accessible to all. At present, the only thing stopping them from facing the same problems of misinformation and the dissemination of toxic content is their size. Reddit follows an old-school topic-based content distribution model — you’re unlikely to stumble across neo-Nazi posts if you’re only looking for cat videos.
Mastodon is entirely different, and may end up being the model that survives long into the future. Users join a group specific to their interests — called a server — that is moderated by its owner. Anyone can create one and set their own rules, and if infractions happen, the administrator can boot the offender. Each server can then connect to others through the Fediverse, so users and their content aren’t entirely siloed. But if a server becomes a source of toxic posts, it can be cut off from the rest.
That’s exactly what happened a few years ago, notes Gehl, who was one of the first academics to research Mastodon and later co-authored the book Social Engineering, which charts the history of manipulative communication.
Far-right microblogging site Gab set up a server in 2019, causing an uproar among the rest of the community. Mastodon founder Eugen Rochko argued at the time that the network’s decentralized nature meant he couldn’t acquiesce to demands to shut down Gab. But individual servers could disconnect from the hate group, and that’s what they did. Gab still existed, but in its own little universe, separated from the rest of the world.
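The mechanics are simple enough to sketch in code. Here is a toy Python model of how defederation caps a rogue server’s reach without deleting it. It is purely illustrative: real Mastodon servers speak the ActivityPub protocol, and the class and method names below are my own invention.

```python
# Toy sketch of Mastodon-style federation. Illustrative only: the real
# network uses ActivityPub; these names and methods are hypothetical.

class Server:
    def __init__(self, name):
        self.name = name
        self.posts = []      # content published by this server's users
        self.peers = set()   # servers this one federates with

    def publish(self, text):
        self.posts.append(text)

    def federate(self, other):
        # Federation is mutual: each side sees the other's public posts.
        self.peers.add(other)
        other.peers.add(self)

    def defederate(self, other):
        # Network-level moderation: cut a server off entirely.
        self.peers.discard(other)
        other.peers.discard(self)

    def timeline(self):
        # A federated feed: local posts plus posts from connected peers.
        # Content on defederated servers simply never arrives.
        feed = list(self.posts)
        for peer in self.peers:
            feed.extend(peer.posts)
        return feed

# Three communities federate; one turns toxic and gets cut off.
art = Server("art.example")
science = Server("sci.example")
toxic = Server("toxic.example")
art.federate(science)
art.federate(toxic)
toxic.publish("hateful post")
print("hateful post" in art.timeline())  # True: federation spreads it
art.defederate(toxic)
print("hateful post" in art.timeline())  # False: reach is gone
```

In this sketch, Gab’s fate maps onto the last two lines: the server keeps publishing, but no federated timeline carries its posts any further.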
Unfortunately, this approach doesn’t solve one of the biggest problems: malicious, false or toxic content. Neo-Nazis can still publish racist posts and make threats against groups or individuals. What the disaggregated structure means, though, is that disinformation can’t travel far. That alone, argues Gehl, is enough to mitigate social media’s most toxic feature: virality. He sees parallels with terrorist groups that tried to move to the dark web — a subset of the world wide web that’s harder to access — and failed to gain traction.
“There was a lot of panic about five or six years ago about ISIS being on the dark web, and that’s how they’re going to recruit people and spread their message,” Gehl said. “ISIS didn’t do that because nobody was on the dark web. The dark web is really small, and a bad organization like ISIS is going to go where people are.”
There’s an unfortunate flipside, though. Keeping internet users within their own silos of interest means the creation of echo chambers, where conspiracy theories fester and alternative views are never entertained. It also means fewer moments of serendipitous discovery. But maybe that’s the price we need to pay to keep social media alive and functional. After all, if the kids can’t play nice together, it may be best they don’t play together at all.