Everyone seems to agree that social media has become a cesspool, with cancel culture mobs enforcing ideological purity on one side and trolls spreading conspiracy theories on the other.
X and Facebook have been accused of amplifying hate and conflict, and the riots in the UK show how a few social media posts can ignite simmering anger and resentment.
In response, governments around the world are cracking down on free speech. Turkey has banned Instagram, Venezuela has banned X, and the British government has jailed people for inciting violence, and in some cases simply for expressing outrageous opinions.
But there are better ways to improve social media than banning accounts and censoring “misinformation.”
The root of the problem isn’t fake news in individual posts, but the way social media algorithms prioritize conflict and highlight the most polarizing content to increase engagement and ad revenue.
“This may sound crazy, but I think the debate about free speech is a complete distraction right now,” Twitter co-founder and former CEO Jack Dorsey told the Oslo Freedom Forum in June. “I think the real debate should be about free will.”
The Market for Social Media Algorithms
Dorsey argues that black-box social media algorithms are influencing our agency by distorting our reality and hacking our mental space. He believes the solution is to give users more control over the type of content they are served by choosing from a variety of algorithms.
“Give people the power to choose which algorithms they use from trusted parties, and give people the choice to build their own algorithms that they can plug into these networks and see what they want. And they can move it. And give people the choice to actually have a market.”
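Dorsey's marketplace can be pictured as a simple pluggable interface. The sketch below is purely illustrative and not any platform's actual API: an algorithm is just a function from raw posts to a ranked feed, and the "marketplace" is a registry the user picks from.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Post:
    id: str
    timestamp: int
    likes: int
    replies: int

# A feed algorithm is just a function: posts in, ranked posts out.
FeedAlgorithm = Callable[[list[Post]], list[Post]]

def chronological(posts: list[Post]) -> list[Post]:
    # Newest first, the pre-2016 Twitter experience.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement(posts: list[Post]) -> list[Post]:
    # Engagement-weighted, the default on most large platforms.
    return sorted(posts, key=lambda p: p.likes + 2 * p.replies, reverse=True)

# The "marketplace": named algorithms from different providers,
# any of which the user can plug into the same underlying network.
MARKETPLACE: dict[str, FeedAlgorithm] = {
    "chronological": chronological,
    "engagement": engagement,
}

def build_feed(posts: list[Post], choice: str) -> list[Post]:
    return MARKETPLACE[choice](posts)
```

The point of the design is that ranking logic is separated from the content firehose, so switching algorithms is a one-line choice rather than switching platforms.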
It’s a simple, yet appealing idea, but there are many hurdles to overcome before mainstream platforms are willing to offer users a choice in algorithms.
Why Social Media Platforms Resist Algorithmic Choices
Princeton computer science professor Arvind Narayanan, who has extensively studied social media algorithms and their impact on society, told Cointelegraph that Dorsey’s idea is great but unlikely to be realized on a large platform.
“Algorithmic marketplaces are an important intervention. Centralized social media platforms don’t give users enough control over their feeds, and that control is dwindling as recommendation algorithms take on a more central role,” he explains.
“I don’t expect centralized platforms to allow third-party algorithms for the same reasons they don’t provide user control in the first place. That’s why decentralized social media is so important.”
There are some early experiments with decentralized platforms like Farcaster and Nostr, but Twitter spinoff Bluesky is the most advanced and already has this functionality built in. However, so far it has mainly been used for custom content feeds.
Bluesky Tests Algorithmic Choice
William Brady, an assistant professor at Northwestern University, told Cointelegraph that Bluesky will be testing a new algorithm in the coming months as an alternative to the site’s default feed.
Research shows that up to 90% of the political content you see online comes from a small group of highly motivated and partisan users. “So one of the key things is to try to reduce some of their influence,” he says.
The “representative diversity algorithm” aims to better represent the most common views rather than the most extreme ones, without making the feed bland.
“We’re not actually removing strong moral or political opinions because we think that’s important for democracy. But we are removing some of the most toxic content that we know is associated with the most extreme people in that distribution.”
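As a rough illustration of the approach Brady describes, and given that up to 90% of political content comes from a small group of hyperactive partisans, such an algorithm might cap any single author's share of the feed and trim only the extreme tail of a toxicity distribution. The names, thresholds, and toxicity scores below are assumptions for the sketch, not the team's actual design.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    toxicity: float  # assumed 0..1 score from some upstream classifier

def representative_feed(posts, max_share=0.1, toxicity_cutoff=0.9):
    """Down-weight hyperactive authors and drop only the most toxic posts.

    Strong moral and political opinions below the cutoff are kept; only
    the extreme tail of the toxicity distribution is removed, and no
    single author may occupy more than max_share of the feed.
    """
    kept = [p for p in posts if p.toxicity < toxicity_cutoff]
    limit = max(1, int(len(kept) * max_share))
    counts = Counter()
    feed = []
    for post in kept:
        if counts[post.author] < limit:
            counts[post.author] += 1
            feed.append(post)
    return feed
```

Note that the cap reduces the influence of the loudest accounts without silencing them, which matches the stated goal of preserving strong opinions while removing only the most toxic extreme.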
Creating Personalized Algorithms Using AI
Approaching the topic from a different angle, Rick Lamers, a Groq AI researcher and engineer, recently developed an open-source browser extension that works on desktop and mobile. The extension scans and rates posts from people you follow, automatically hiding posts based on content and sentiment.
Lamers told Cointelegraph that he built the extension so he could keep following people on X for their AI content while avoiding their inflammatory political posts.
“I wanted something in between unfollowing and seeing all their content, which resulted in the LLM/AI selectively hiding posts based on topic.”
Using large language models (LLMs) to classify social media content opens up exciting possibilities for building personalized algorithms without requiring consent from the platforms.
But we’re not there yet: reordering content in the feed is a much bigger challenge than simply hiding posts, according to Lamers.
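A minimal sketch of this kind of client-side filtering is shown below. The function names and topic labels are hypothetical, not Lamers' implementation, and the keyword matcher is a stand-in for what would really be an LLM call classifying each post's topic.

```python
def classify_topic(text: str) -> str:
    """Stand-in for an LLM call. A real extension would send the post
    text to a model with a prompt like:
    'Label this post with one topic: politics / ai / other'."""
    political = {"election", "war", "protest", "gaza", "iran"}
    lowered = text.lower()
    if any(word in lowered for word in political):
        return "politics"
    if "ai" in lowered.split() or "llm" in lowered:
        return "ai"
    return "other"

def filter_feed(posts: list[str], hidden_topics: set[str]) -> list[str]:
    # Hide rather than unfollow: the author's other posts still get
    # through, which is the "intermediate step" described above.
    return [p for p in posts if classify_topic(p) not in hidden_topics]
```

Because the filtering runs entirely in the user's browser, it needs no cooperation from the platform, which is what makes the approach attractive; reordering the feed, by contrast, would require intercepting and rebuilding the page itself.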
How Social Media Algorithms Amplify Conflict
When social media first started in the early 2000s, content was presented in chronological order. But in 2011, Facebook’s News Feed began selecting which “Top Stories” to show users.
Twitter followed suit with its “While You Were Away” feature in 2015, and then switched to an algorithmic timeline in 2016. The world as we knew it was over.
Everyone claims to hate social media algorithms, but in reality, they are incredibly useful in helping users find the most interesting and engaging posts among the vast amount of content.
Dan Romero, founder of decentralized platform Farcaster, pointed Cointelegraph to a thread he wrote on the topic, arguing that all world-class consumer apps use machine learning-based feeds because that’s what users want.
“This is overwhelmingly what consumers prefer in terms of time spent,” he said.
Unfortunately, algorithms quickly figured out that the content people were most interested in included conflict, hatred, political polarization, conspiracy theories, anger, and public outcry.
“When you open your feed, you’re bombarded with the same content,” says Dave Katudal, co-founder of SocialFi platform Lyvely.
“I don’t want to be bombarded with Yemen, Iran, Gaza, Israel, and all of those things. (…) They are clearly trying to foment some kind of political, destructive conflict. They want conflict.”
Research shows that algorithms consistently amplify moral, emotional, and group-based content, which Brady describes as an evolutionary adaptation.
“We have a bias toward paying attention to this type of content because in small group settings, it actually gives us an advantage,” he says. “Paying attention to emotional content in our environment helps us survive physical and social threats.”
Social Media Bubbles Work Differently
The old notion of a social media bubble, in which users only see content they already agree with, isn’t actually accurate.
While bubbles exist, research shows that users are exposed to more opinions and ideas than they realize, and they hate it more than ever. That’s because they’re most likely to engage with content that provokes anger: getting into arguments, dunking on quote tweets, or piling on.
Content you dislike works like a riptide: the more you fight against it, the more of it the algorithm serves you. It also reinforces people’s beliefs and darkest fears by highlighting the worst views of the “other side.”
Like the tobacco companies in the 1970s, platforms are well aware that their focus on engagement is damaging to individuals and society, but the stakes seem too high to change course.
Meta made $38.32 billion in ad revenue last quarter (98% of its total revenue), which chief financial officer Susan Li attributes largely to AI-based ad placement. Facebook has tested a “bridging algorithm” aimed at bringing people together rather than dividing them, but decided not to put it into production.
Bluesky, Nostr and Farcaster: The Marketplace of Algorithms
Dorsey also realized that he couldn’t make meaningful changes to Twitter, so he created Bluesky in an attempt to build an open-source, decentralized alternative. But he became disillusioned with Bluesky making many of the same mistakes as Twitter, and now supports the Bitcoin-friendly Nostr.
Decentralized networks allow users to choose which clients and relays to use, and potentially to choose from a variety of algorithms.
But one big problem with decentralized platforms is that building good algorithms is a huge undertaking, often beyond the capabilities of community developers.
A developer team built a decentralized feed marketplace for Farcaster at the Paradigm hackathon last October, but no one showed interest.
According to Romero, the reason is that community-built feeds are “unlikely to be performant and cost-effective enough for a modern, large-scale consumer UX.” An open-source, self-hosted client might work, however.
He said in another thread, “Creating good machine learning feeds is hard, and making them performant and real-time requires significant resources.”
“If you want to build a feed marketplace with a good UX, you’ll likely need to build a backend where developers upload models and clients run the models on (the infrastructure). There are obviously privacy concerns here, but it may be possible.”
But the bigger problem is that “it’s still unclear whether consumers are willing to pay for algorithms.”
Andrew Fenton
Andrew Fenton, based in Melbourne, is a journalist and editor covering cryptocurrencies and blockchain. He has worked as a national entertainment writer for News Corp Australia, a film journalist for SA Weekend, and The Melbourne Weekly.
Follow the author @Andrew Fenton