Article

Social Media Moderation: Is it Profitable to Fight Fake News?

Strategy

Digital content-sharing platforms provide important information to citizens around the world, but they also face enormous challenges, not least those posed by malicious online information manipulators. Attempts to moderate online content are expensive and are constrained by business models. In addition to investigating the limits of content moderation, this study by HEC Professor of Strategy and Business Policy Olivier Chatain suggests that government bodies may have a role to play, providing new rules on how to moderate online content while safeguarding freedom of expression.

Cover image: social media on a smartphone. Photo credits: Metamorworks on Adobe Stock

New forms of conflict, below the threshold of physical violence, are spreading on digital content-sharing platforms such as social networks, video-sharing sites, and messaging services. At its peak, there were around 200 million monthly engagements with misinformation stories on Facebook [1]. Many believe this "fake news" has influenced election results and spread misinformation about Russia's invasion of Ukraine, for example.

Digital platforms hosting this content are usually created and administered by private organizations, such as Facebook and Instagram, which shape online environments while working under a range of economic incentives and constraints that are rarely explored in security studies. Among the common design choices made by platform developers is the leveraging of so-called network effects, whereby being a user is more valuable the more users there are on the network. Ideally, a platform will present a large amount of content that users want to consume, while stimulating them to create new content that others will want to consume.

Unfortunately, network effects make these platforms equally attractive to canny and adaptive information manipulators. Because of this, content-sharing platforms need to perform effective content moderation – the process by which they review and remove content that violates their terms of service. Many moderation activities are labour-intensive, and therefore not easily scalable, while those that are scalable rely on imprecise algorithms.
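
The trade-off between labour-intensive review and imprecise automation can be made concrete with a toy example. The sketch below is purely illustrative (the keyword list, example posts, and function names are invented, not taken from the report): it flags posts by simple keyword matching, the most easily scalable approach, and shows how such a filter both misses reworded manipulation and flags legitimate content that merely mentions a banned phrase.

    # Illustrative sketch only: keyword matching as the simplest "scalable" moderation.
    BANNED_PHRASES = {"miracle cure", "the election was stolen"}  # invented examples

    def flag_for_review(text: str) -> bool:
        """Return True if the post should be queued for (costly) human review."""
        lowered = text.lower()
        return any(phrase in lowered for phrase in BANNED_PHRASES)

    posts = [
        "This miracle cure ends the pandemic overnight!",             # caught
        "Th1s m1racle cure ends the pandemic overnight!",             # missed: false negative
        "Fact-check: the claim that the election was stolen is false" # flagged: false positive
    ]
    for post in posts:
        print(flag_for_review(post), "-", post)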

Business model as an open door

The typical business model for digital platforms involves first growing the user base to maximize network effects. A platform needs to find the right mix of users, both content creators and advertisers, and different types of users pay different prices: typical users may use the platform for free, those who produce especially attractive content may be subsidized, and those who benefit most from the platform, namely advertisers, are required to pay. In addition to managing network effects and pricing, digital platforms exploit economies of scale through the deployment of labour-saving software.
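
A back-of-the-envelope calculation makes this pricing mix concrete. All figures below are invented for illustration and are not drawn from the report; they simply show how a platform that charges only advertisers must cover both creator subsidies and moderation costs out of advertising revenue.

    # Invented, illustrative numbers only.
    advertisers = 2_000
    ad_fee_per_month = 5_000.0           # what each advertiser pays (assumed)
    subsidized_creators = 10_000
    subsidy_per_creator = 200.0          # paid to keep attractive content flowing (assumed)
    moderation_staff = 1_500
    cost_per_moderator = 4_000.0         # monthly, fully loaded (assumed)

    revenue = advertisers * ad_fee_per_month
    creator_costs = subsidized_creators * subsidy_per_creator
    moderation_costs = moderation_staff * cost_per_moderator
    margin = revenue - creator_costs - moderation_costs
    print(f"Monthly margin after subsidies and moderation: {margin:,.0f}")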

Looking more closely, we see some common design choices that can work to turn these platforms into arenas of conflict. For example, they often provide highly efficient creation tools and search algorithms. These tools amplify network effects, increasing the chances that certain content will 'go viral', i.e., achieve a high level of awareness due to shares and exposure. This effect is easily exploited by activists advancing fringe agendas, who in turn elicit more hostile reactions from users who disagree.
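
A stylized way to see the viral tipping point mentioned above is to model sharing as a simple branching process, in which each viewer goes on to expose, on average, a certain number of new viewers; once that number exceeds one, exposure compounds. The model, the probabilities, and the starting audience below are assumptions for illustration only and do not come from the article.

    import random

    def simulate_reach(share_prob: float, reach_per_share: int,
                       steps: int = 10, seed: int = 0) -> int:
        """Total exposures from a toy branching model of sharing."""
        random.seed(seed)
        current = total = 100                     # assumed initial audience
        for _ in range(steps):
            shares = sum(1 for _ in range(current) if random.random() < share_prob)
            current = shares * reach_per_share    # each share reaches new viewers
            total += current
        return total

    # 0.2 x 4 = 0.8 new viewers per viewer: the content fizzles out.
    print("sub-viral:", simulate_reach(share_prob=0.2, reach_per_share=4))
    # 0.4 x 4 = 1.6 new viewers per viewer: exposure compounds ("goes viral").
    print("viral:    ", simulate_reach(share_prob=0.4, reach_per_share=4))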

 


 

All this puts content-sharing platforms in a difficult position: insubstantial or ineffective moderation can harm users and the wider public, while choices about what to allow, what to remove, and what to down-rank or ban often involve value judgments, leaving moderators open to accusations of bias. A vicious circle ensues: lack of moderation leads to user polarization, user polarization leads to criticism of moderation by one faction or another, criticism of moderation leads to a reluctance to undertake moderation, and so on.

Legitimacy and fairness

The sheer volume of content shared on digital platforms makes it difficult for operators to effectively moderate that content, while nefarious actors can quickly adapt their strategies, using tactics such as coordinated inauthentic activity.

Examples of groups that pose threats through the dissemination of manipulated information include home-grown political fringe groups in democratic states, such as climate change sceptics, and the state apparatus itself in authoritarian states. Across borders, intelligence and propaganda arms of various states may undertake such activities against other states.

Recently cited cases of online information manipulation include Russian attempts to destabilize the 2016 US elections and the injection of fake news into the 'Macron leaks' during the 2017 French presidential election. In the context of the Russian-Ukrainian war, disinformation about alleged biological warfare labs in Ukraine has been disseminated on social media by covert networks presumed to be Russian and Chinese. Anti-vaccination propaganda has also been spread since the beginning of the Covid-19 pandemic.

Is content moderation sustainable for social media’s business? 

Content moderation must fit into a platform's business model. Today, very large portions of the largest platforms’ workforces are engaged in content moderation, and it is not clear that this can be sustained, given existing revenues.

Twitter's long-lasting financial struggles can be understood as the difficulty a relatively small platform faces in remaining profitable while providing adequate moderation. The new management under Elon Musk seems to be gambling on drastically reducing all costs, including moderation costs, hoping that revenues will not drop too much as brand advertisers, who do not want to be associated with an unmoderated platform, take their leave. Whether this is possible remains to be seen.

When a platform does decide to go forward with moderation activities, developing the actual procedure can still be quite daunting. First, consensus must be created as to how to treat political or ideological content. There must be clear and established legitimacy behind any guidelines and criteria. In that respect, regulation, when instituted through a democratic process, can be very helpful. And with the European Union's Digital Services Act (DSA) coming into force, regulators in the EU will be empowered to mandate a minimal level of moderation.

The EU Digital Services Act

Ethics is subjective and relative. In today's open and free democracies, where one person's truth may be another person's fake news, the key issue is how to ensure that content moderators do not become mere censors of unpopular views. Any framework needs to be supported by a democratic process to be legitimate. The DSA is very much about the means of moderation that platforms need to provide or invest in – the 'how to moderate' rather than the 'what to moderate' – presuming the former will be easier to agree on than the latter.

The DSA, it is hoped, can ensure a minimum level of moderation across all platforms, but it may also serve to entrench the very largest platforms, which are the only ones likely to remain profitable while performing the requisite moderation. Thus, in the future, we may see more concentration at the top; new platforms may need to remain small, following models in which moderation is either not necessary or impossible, to avoid the associated costs.
 

[1] Fake News Statistics – How Big is the Problem? Business Tips, JournoLink.

Applications

Information manipulators on digital content-sharing platforms present a real and pressing challenge. Governments, platforms, and digital businesses need to carefully weigh the potential benefits and drawbacks of different approaches to content moderation and work together to develop effective strategies, including regulation that addresses these challenges without compromising the free flow of information and ideas.
Based on an interview with Professor Olivier Chatain about his report, "The Business Model of Content-Sharing Platforms and the Supply of Content Moderation, Implications for Combating Information Manipulations," published in October 2022 by IRSEM.
