Working Paper: Social Media, Disinformation and Electoral Integrity

A new paper, Social Media, Disinformation and Electoral Integrity, articulates the distinct problems that threaten our elections. Earlier today I moderated a private discussion of the paper with one of the authors.

Since the 2016 United States (U.S.) presidential election, the issue of social media and disinformation has gained increasing attention as a fundamental threat to the integrity of elections worldwide. Whether by domestic actors, such as candidates and campaigns, or through foreign influence campaigns, the ability of voters to make informed choices based on fair and balanced information has been significantly skewed. This working paper attempts to examine the challenges that this issue poses to electoral integrity and what responses election management bodies (EMBs) and international nongovernmental organizations (INGOs) such as the International Foundation for Electoral Systems (IFES) can take to attempt to mitigate the negative consequences. The solutions presented in this paper aim to assist key stakeholders to meet this emergent and mutable threat…

While many aspects of traditional media and elections literature are pertinent to the topic of disinformation, there are many fundamental differences that make thinking in this field unique and potentially require an altered set of analytical tools to help theoreticians and practitioners more accurately navigate this space...

Experts opine that a clear differentiation should be made between cyber threats and cyber-enabled (technology) information operations. The main relevance lies in the proper allocation of resources for tackling each unique set of problems: human expertise, material resources, the strategies to be implemented, and the specific technologies that need to be developed and deployed. The mistake of putting both under the umbrella of cyber threats has been repeatedly made, with obvious consequences…

Agents include independent trolls (“human-controlled accounts performing bot-like activities” or harassing others online),9 paid trolls, conspiracy theorists, disinformation websites, partisan media, politicians, foreign governments, influential bloggers, activists or government officials, and ordinary users gathered en masse.10 Their intents and targets vary; for example, domestic partisan agents may use disinformation to win campaigns through smear tactics, hostile foreign state or nonstate authoritarian agents may intend to structurally undermine democracy by increasing intolerance and polarization, and disaffected anarchic agents may intend to dismantle state institutions and social order. While many are primarily concerned at the moment with automated/inauthentic means of amplification, there is a growing need to also start addressing the role played by parties, politicians and hyperpartisan media in creating, disseminating and “endorsing” disinformation and divisive content…

Humans did not evolve to process information and respond rationally; instead, they use mental shortcuts to simplify decision-making. These heuristics combine with another evolved feature, the need to belong to a group, to create vulnerabilities to the kind of systematic manipulation disinformation campaigns use. Our heuristics and biases dispose us to believe information when it is presented in certain ways, and wanting to send the proper in-group signals leads people to spread information even if they don’t necessarily trust it…

People are generally more attracted to news with false information than to news with true information. In a 2018 study on the spread of news stories on Twitter, the MIT Media Lab found that “falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information.”24 The truth took “about six times as long as falsehood to reach 1,500 people,” and, controlling for relevant variables, falsehoods were “70% more likely to be retweeted than the truth”…

The paper also provides very good definitions of the various types of disinformation, misinformation, and related concepts.

Unfortunately, there is as yet no list of clear solutions. The paper does, however, provide a good survey of what is being and has been done, and of the potential areas that might provide mitigation.

