
Social Media Regulation: Content Control vs. Platform Nudging

Frank Fagan, EDHEC Augmented Law Institute Affiliate Researcher

26 Jun 2018

Many think that social media platforms can regulate themselves effectively. Because political disinformation campaigns carry such a high cost, the idea goes, platforms will begin to respond with tighter controls over content and advertising. If private platform interests are adequately aligned with broader public interests, self-regulation makes sense. It is cheaper, more agile, and less politically messy.  Problems arise when platform and government interests are misaligned, even partly so. Where interests diverge, classical economic theory plainly teaches that self-regulation will not be forthcoming. If the divergent government interests are good for society as a whole, then there is a case for government regulation—so long as its benefits exceed its costs.

 

Regulation is justified by market failure. In the world of social media, there are three relevant players: producers, consumers, and platforms. The producers of content overwhelmingly consist of regular people and advertisers. When they engage in private interactions, they either fully internalize the costs and benefits of those interactions, or generate information that can be shared for the public good. If the advertisers were a collection of chemical factories, for instance, it would be as if they created cleaning products without emitting any pollutants into the atmosphere.  

 

Sometimes, however, private interactions generate negative effects experienced by third parties. People who purchase advertising in order to spread false information drive up the costs of truth-seeking. An entity that purchases advertising both for and against a political issue often seeks to pollute discourse and foster institutional decay. These types of interactions among producers, consumers, and platforms can generate public bads (as opposed to public goods). When amplified, they can lead to a discursive crisis.

 

It is tempting to sound the alarm, blame fake news, and get tough on platforms. On the other hand, we might begin by asking why people seemingly want to consume fake news and gravitate toward sensational media outlets. Why does politics seem increasingly like entertainment and spectacle? Recent work in social psychology suggests that people view politics more and more as a team sport. It's less about deliberating over the scientific basis for climate change, for instance, and more about opposing reification and alienation with "Yay, team!" and "Boo, the other guys!"

 

If the social psychologists are correct, then regulation should be less about targeting specific content and more about configuring the platform to isolate and quarantine bad actors, and, in the process, to drive regular people toward platform areas where discursive excellence thrives. For example, Reddit has learned that banning trolls doesn't solve the troll problem; trolls simply change their usernames and continue trolling. But when "shadowbanned," a troll's posts are hidden from everyone else without the troll being told. Trolls continue to troll, thinking that everyone is listening, when in fact they are posting to an empty forum where no attention is being paid. This is just one example of configuring platform architecture to promote discursive excellence.
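To make the mechanism concrete, here is a minimal sketch of how such a visibility filter might work, assuming a simple post store and a set of flagged accounts; the names and data structures are illustrative assumptions, not Reddit's actual code. The shadowbanned author still sees their own posts, while everyone else does not.

```python
# Minimal sketch of shadowban-style visibility filtering.
# Illustrative only: the data structures and function names below are
# assumptions for this example, not Reddit's actual implementation.

shadowbanned_users = {"troll42"}  # accounts flagged by moderators

posts = [
    {"author": "alice", "text": "Thoughtful thread on climate data."},
    {"author": "troll42", "text": "Inflammatory bait."},
]

def visible_posts(viewer, all_posts):
    """Return the posts a given viewer can see.

    A shadowbanned author still sees their own posts (so they believe they
    are being heard), but those posts are hidden from everyone else.
    """
    return [
        p for p in all_posts
        if p["author"] not in shadowbanned_users or p["author"] == viewer
    ]

print([p["text"] for p in visible_posts("alice", posts)])    # troll's post is hidden
print([p["text"] for p in visible_posts("troll42", posts)])  # troll still sees it
```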

 

A systemic approach to social media regulation recognizes that people desire fake news and succumb easily to polarization. Media literacy will not meaningfully reduce the demand, nor will heavy policing of ambiguously lawful content. A better way is to reinforce existing platform areas where discursive excellence thrives, and to funnel people to those areas with creative platform nudging.

 

Read the full academic article, published in Volume 16 of the Duke Law & Technology Review