Propaganda in the Digital Age

Monday, February 26, 2018

 

 

Media revelations about the alleged use of social media networks by the Russian government to influence the 2016 presidential election in the United States or, more broadly, to disrupt electoral processes in Europe have noticeably shifted public and academic discourse towards the “dark side” of digital diplomacy. The optimism of the early days of the “Arab Spring”, when digital platforms seemed to empower the powerless, has given way to pessimism induced by the proliferation of echo-chambers of hate and the rise of post-truth politics. It is therefore important to take stock of these developments and ask ourselves what exactly we know about digital propaganda, what we do not know, and what we should know so that we can contain, if not prevent, its disruptive effects.

 

What do we know?

 

To begin with, what we positively know by now is that state-sponsored propaganda has exploded with the rise of social media, and the numbers are staggering. For example, according to the congressional testimony of Facebook, Google and Twitter representatives, more than 150 million people were likely exposed to the Russian disinformation campaign prior to the 2016 presidential election. To put it into context, only 20.7 million people watched the evening news broadcasts of the ABC, CBS, NBC and Fox stations in 2016 (Lang, 2017). We also know that the classical understanding of propaganda as the “management of collective attitudes by the manipulation of significant symbols” (Lasswell, 1927) still stands. What has changed is the way this manipulation takes place, as the digital medium comes with its own intrinsic features. Algorithmic dissemination of content and the circumvention of traditional media filters and opinion-formation gatekeepers make disinformation spread faster, reach deeper, carry a stronger emotional charge and, most importantly, prove more resilient, owing to the confirmation bias that online echo-chambers enable and reinforce. Finally, we know not only what and how, but also why digital propaganda has become such a phenomenon. From a broader geopolitical perspective, digital propaganda, as the “Gerasimov Doctrine” suggests, is an effective non-military means for achieving political and strategic goals, one whose effectiveness can exceed the power of weapons (cited in MacFarquhar, 2016). In other words, the weaponization of information via digital propaganda has come to be seen by some states as the optimal instrument for correcting power asymmetries in their global standing.

 

What don’t we know?

 

Intriguingly, despite the explosion of channels, botnets and content involved in digital disinformation, what we do not firmly know is whether digital propaganda is actually successful. Certainly, the promotion of echo-chambers of hate and the online escalation of political polarization are tangible effects of digital propaganda, which cannot be ignored. What is more difficult to assess is the impact these digital effects have on the opinions and behaviour of the people exposed to them. For example, studies have shown that junk news “outperformed” real news on social media in certain states during the 2016 United States presidential election, and the proportion of professional news content being shared hit its lowest point the day before the election (Howard, Bolsover, Kollanyi, Bradshaw, & Neudert, 2017). These findings show the extent to which voters were exposed to political disinformation, but say little about whether this level of propaganda actually changed the behaviour of the voters in those states. The evidence gap is arguably small, but significant. Similarly, in Europe, there is strong evidence of digital interference by Russia in the political processes of EU member states (Bjola & Pamment, 2016; East Stratcom Task Force, 2017). At the same time, the international image of Russia and President Putin has plummeted in recent years: in Europe, for instance, 78% of people express a lack of confidence in the Russian president (Vice, 2017). The image of the European Union, a frequent target of digital disinformation, is, on the other hand, on a positive trend, after dropping during the financial and migration crises (European Commission, 2017). While the impact of propaganda is, of course, a notoriously difficult variable to measure, we should be careful not to confine its scope only to the digital sphere. Instead, better methodologies are required to capture how the dissemination of disinformation online generates offline influence in terms of opinion or behavioural change.

 

What should we know?

 

In view of the disruptive effects that digital propaganda has on societal discourse, it is important to know how to (counter-)react to systematic campaigns that seek to promote disinformation. More specifically, we need to know whether defensive measures like the ones currently undertaken by the EU’s East StratCom Task Force are sufficient, or whether more offensive measures are necessary and, if so, in what form. Defensive counter-strategies are useful for exposing patterns of digital propaganda, identifying nodes of influence in the disinformation network and improving media literacy about how propaganda works and how not to play its game. At the same time, they exercise minimal control over the agenda of the online conversation, since by their very nature they only react, in an almost “whack-a-mole” fashion, to the themes and topics advanced by those promoting digital disinformation. Going on the offence, on the other hand, could help regain some control over the disinformation agenda, minimize the corrosive impact on societal discourse, and make it more costly for the other side to engage in disinformation. Specially designed algorithms could be used, for instance, to map disinformation networks (nodes, botnets) and to disrupt the scope and speed of their message dissemination, as the sketch below illustrates. In addition, a robust strategic narrative should be employed to make the positive case for a particular foreign policy and to make it more difficult for digital propagandists to respond without contradicting themselves.
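
To make the idea of network mapping more concrete, the sketch below shows, in rough terms, how such an analysis might look. It is purely illustrative and rests on assumptions of my own: a toy set of “who amplified whom” records (all account names are invented), the open-source networkx library, and simple centrality measures as a proxy for “nodes of influence”. It does not describe any tool actually used by the East StratCom Task Force or any other body.

    # Illustrative sketch only: rank accounts in a hypothetical amplification
    # network so that the most influential nodes can be prioritised for
    # disruption (e.g. takedown requests, de-ranking, counter-messaging).
    import networkx as nx

    # Hypothetical records: an edge (u, v) means account v amplified content from u.
    amplifications = [
        ("seed_account", "bot_001"), ("seed_account", "bot_002"),
        ("bot_001", "bot_003"), ("bot_001", "bot_004"),
        ("bot_002", "user_A"), ("bot_003", "user_B"),
        ("bot_004", "user_B"), ("user_A", "user_C"),
    ]
    graph = nx.DiGraph(amplifications)

    # Out-degree (how widely an account seeds content) and betweenness centrality
    # (how often it bridges otherwise separate audiences) approximate influence.
    betweenness = nx.betweenness_centrality(graph)
    ranking = sorted(
        graph.nodes,
        key=lambda n: (graph.out_degree(n), betweenness[n]),
        reverse=True,
    )

    # Report the top candidates and how far their content can travel downstream.
    for account in ranking[:3]:
        reach = len(nx.descendants(graph, account))
        print(f"{account}: content can reach {reach} downstream accounts")

Disrupting the top-ranked nodes first is, in this toy model, what would most reduce both the scope (downstream reach) and the speed (number of parallel amplifiers) of dissemination; real campaigns would of course require far richer data and far more careful attribution.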

 

To conclude, the changing technological landscape ensures that digital propaganda is here to stay. Foreign ministries might not be able to eliminate it completely, but they should be able to reduce its corrosive consequences by better understanding how it works, what type of offline impact it generates and how to combine defensive and offensive counter-strategies in support of their activities.

 

This article was first published in Global Affairs, Vol. 3, No. 3 (February 2018).

 

References:

Bjola, C., & Pamment, J. (2016). Digital containment: Revisiting containment strategy in the digital age. Global Affairs, 2(2), 131-142.

 

East Stratcom Task Force. (2017). EU versus Disinformation. Retrieved from https://euvsdisinfo.eu/

European Commission. (2017). Eurobarometer: Image of the European Union. Retrieved from http://ec.europa.eu/commfrontoffice/publicopinion/index.cfm/Chart/getChart/chartType/lineChart//themeKy/19/groupKy/102/savFile/850

 

Lang, M. (2017, Nov 1). Number of Americans exposed to Russian propaganda rises, as tech giants testify. Retrieved from http://www.sfchronicle.com/business/article/Facebook-Google-Twitter-say-150-million-12323900.php 

 

Lasswell, H. D. (1927). The Theory of Political Propaganda. American Political Science Review, 21(3), 627-631.

 

MacFarquhar, N. (2016, Aug 28). A Powerful Russian Weapon: The Spread of False Stories. Retrieved from https://www.nytimes.com/2016/08/29/world/europe/russia-sweden-disinformation.html

 

Howard, P. N., Bolsover, G., Kollanyi, B., Bradshaw, S., & Neudert, L.-M. (2017). Junk News and Bots during the U.S. Election: What Were Michigan Voters Sharing Over Twitter? Data Memo, Project on Computational Propaganda. Retrieved from http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/03/What-Were-Michigan-Voters-Sharing-Over-Twitter-v2.pdf

 

Vice, M. (2017, Aug 16). Publics Worldwide Unfavorable Toward Putin, Russia. Retrieved from http://www.pewglobal.org/2017/08/16/publics-worldwide-unfavorable-toward-putin-russia/

 

Prof. Corneliu Bjola

I'm an Oxford scholar seeking to make sense of "unknown unknowns" in international diplomacy, a tech geek constantly on the lookout for the next Cool Thing, and an unrepentant Big Lebowski fan ("lotta ins, lotta outs, lotta what-have-you's..").
