Misinformation is a problem for political extremes, more so for conservatives

The COVID-19 pandemic created ideological divisions within society about everything from virus origins to vaccination efforts, largely whipped up by ubiquitous misinformation. Our study of Twitter conversations about the pandemic explores partisan asymmetries in exposure to misinformation.

On January 20, 2020, the first recorded case of COVID-19 in the United States was reported. What transpired next has been nothing short of unprecedented. While early discourse on social media platforms reflected a population coming to terms with a new reality, discussions quickly became polarized. Everything from the origins of the virus to the safety of vaccines divided public opinion along partisan lines. Science had been politicized in the midst of a raging pandemic.

Concerns about social media's standing as a reliable news source grew as proliferating misinformation hindered the adoption of prophylactic measures and mitigation efforts. While previous survey-based studies have identified a link between political conservatism and misinformation, the interplay between the information people are exposed to and what they share online was not clear. Understanding this relationship can help identify vulnerabilities and help policy makers better estimate the impact of misinformation beyond mere sharing, adjust their messaging, and improve awareness.

While the politicization of science is nothing new, its harms have never been this pervasive. Survey-based experiments focus on individual psycho-social characteristics but do not account for the influence of interpersonal relationships. How individuals perceive information and form attitudes is largely shaped by their social circles. People increasingly surround themselves with those who share their ideologies, locking themselves in echo chambers. While echo chambers have been studied extensively in politics, their role in exposing people to misinformation remains underexplored.

Twitter is one social media platform that allows researchers to analyze social structures and relationships at scale. We leverage a publicly available Twitter dataset of COVID-19 discourse in the United States between January 21, 2020 and July 31, 2020, and use the URL-sharing behaviors of users to quantify the partisanship and factuality of the content they share and are exposed to.
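To make the scoring concrete, here is a minimal Python sketch of how domain-level ratings might be aggregated into per-user scores. The domain names, rating values, and function names are hypothetical stand-ins for illustration; the study relies on third-party media ratings, not these toy numbers.

```python
from urllib.parse import urlparse
from statistics import mean

# Hypothetical domain-level ratings (illustrative values only); in
# practice these would come from third-party media rating lists.
POLITICAL_SCORE = {"leftnews.example": -0.8, "centrist.example": 0.0,
                   "rightnews.example": 0.8}
FACTUAL_SCORE = {"leftnews.example": 0.7, "centrist.example": 0.9,
                 "rightnews.example": 0.4}

def domain(url):
    """Extract the registered domain from a shared URL."""
    return urlparse(url).netloc.lower().removeprefix("www.")

def user_scores(shared_urls):
    """Average the ratings of the rated domains a user shared.

    Returns (political, factual) scores, or None if no shared URL
    matches a rated domain.
    """
    rated = [d for d in map(domain, shared_urls) if d in POLITICAL_SCORE]
    if not rated:
        return None
    return (mean(POLITICAL_SCORE[d] for d in rated),
            mean(FACTUAL_SCORE[d] for d in rated))

print(user_scores(["https://rightnews.example/story",
                   "https://www.centrist.example/a"]))
```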

Existing understanding of exposures relies on the follower graph, with individuals being exposed to content from the accounts they follow, whom we refer to as their friends. However, by recommending content, personalization algorithms expand the set of possible exposures beyond followed accounts. Ignoring exposures from friends of friends or other recommended content risks underestimating the echo chamber effect. As an alternative, we use retweet interactions between users as evidence of exposure. Retweeting re-posts content that appeared on a user's timeline, whether through friends or recommendations; before retweeting a message, the user has certainly been exposed to it.
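A sketch of how retweet-based exposure could be assembled, assuming a simple edge list of (retweeter, original author) pairs. The accounts and scores below are illustrative only, not drawn from the dataset.

```python
from collections import defaultdict
from statistics import mean

# Toy retweet edges: (retweeter, original_author). Retweeting implies
# the retweeter saw the author's content, so the authors a user
# retweets define that user's exposure set.
retweets = [("alice", "bob"), ("alice", "carol"), ("dave", "bob")]

# Hypothetical per-author political scores of the content they share.
author_political = {"bob": 0.6, "carol": -0.2}

exposure_sources = defaultdict(set)
for retweeter, author in retweets:
    exposure_sources[retweeter].add(author)

def exposure_score(user):
    """Mean political score of the content a user was exposed to,
    estimated from the authors they retweeted."""
    sources = exposure_sources.get(user, set())
    scores = [author_political[a] for a in sources if a in author_political]
    return mean(scores) if scores else None

print(exposure_score("alice"))  # average over bob and carol
```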

Partisan Asymmetries in Exposure to Misinformation

Figure 1(a): Exposure within political echo chambers; color indicates the median factual exposure in each bin.
Figure 1(b): Exposure within factual echo chambers; color indicates the median political exposure in each bin.

Figures 1(a) and 1(b) visualize the information users see and share along the political and factual dimensions, respectively. Each point corresponds to a user in our sample. In Figure 1(a), the position of the point along the x-axis gives the political score of the information the user shares, while the position along the y-axis gives the political score of the information the user sees. The color of the point corresponds to the factuality of the information the user sees: the redder the point, the more factual information the user sees; the greener it is, the more misinformation. The high-density diagonals are evidence of echo chambers in both dimensions. Consistent with previous studies, we find a correlation between political conservatism and the propensity to share misinformation.

However, we also find that the relationship is more diffuse and asymmetric. Liberals exposed to hardline liberal content see more misinformation (the green stripe at the bottom left of Figure 1(a)), and as liberals become more exposed to conservative content, they see more misinformation. The mirror image does not hold for conservatives: conservatives exposed to conservative content tend to see more misinformation, while moderately conservative users exposed to liberal content receive more factual information. Unlike for liberals, exposure to politically moderate content does not promote factual information among conservatives. Similar asymmetries appear in the factual echo chambers (Figure 1(b)). Users who share misinformation and are exposed to misinformation tend to see more conservative content, whereas among people sharing factual information, those exposed to more factual information tend to see politically moderate content (shown in white). (The boxy outline is an artifact of the discretized scores.)
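For readers who want to reproduce this style of plot, a minimal sketch with simulated data follows: it bins users by the political scores of shared and seen content and colors each bin by the median factuality of seen content. All data here are synthetic stand-ins, not the study's scores.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import binned_statistic_2d

# Synthetic stand-in data: per-user political scores of shared (x) and
# seen (y) content, plus factuality of seen content (color).
rng = np.random.default_rng(0)
shared = rng.uniform(-1, 1, 5000)
seen = shared + rng.normal(0, 0.2, 5000)           # echo chamber diagonal
factual_seen = 1 - np.abs(seen) + rng.normal(0, 0.1, 5000)

# Median factuality of seen content within each (shared, seen) bin.
stat, xe, ye, _ = binned_statistic_2d(
    shared, seen, factual_seen, statistic="median", bins=40)

# Reversed red-green colormap so more factual bins render redder,
# matching the convention described for Figure 1(a).
plt.pcolormesh(xe, ye, stat.T, cmap="RdYlGn_r")
plt.colorbar(label="median factuality of seen content")
plt.xlabel("political score of shared content")
plt.ylabel("political score of seen content")
plt.savefig("echo_chamber.png")
```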

Hardline partisans amplify misinformation

Figure 2: Relationship between amplified partisanship and misinformation. Individuals who share more hardline information than what they are exposed to also tend to share more misinformation than they are exposed to.

Given that hardline conservatives and liberals are exposed to the most misinformation, we wanted to assess how this affects what they share. To this end, we quantify how much more factual the content each user shares is compared to what they have been exposed to (the y-axis in Figure 2), and how much more hardline the information they share is with respect to their exposures (the x-axis). The strong negative correlation highlights that not only do hardline partisans have a higher propensity to generate misinformation, but users who amplify politically polarized content also amplify misinformation. With color showing each individual's political partisanship, we see that hardline liberals and conservatives alike amplify both misinformation and partisan content.
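A minimal sketch of this amplification analysis on simulated data: for each user, take the difference between what they share and what they are exposed to along both dimensions, then correlate the two deltas. The synthetic scores below merely illustrate the computation, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
# Hypothetical per-user scores; in the study these come from the
# URL-based ratings of shared vs. exposed content.
exposed_hardline = rng.uniform(0, 1, n)
shared_hardline = np.clip(exposed_hardline + rng.normal(0.1, 0.15, n), 0, 1)
exposed_factual = 1 - 0.5 * exposed_hardline + rng.normal(0, 0.05, n)
shared_factual = (exposed_factual
                  - 0.6 * (shared_hardline - exposed_hardline)
                  + rng.normal(0, 0.05, n))

# Amplification = what a user shares minus what they were exposed to.
delta_hardline = shared_hardline - exposed_hardline   # x-axis in Fig. 2
delta_factual = shared_factual - exposed_factual      # y-axis in Fig. 2

r = np.corrcoef(delta_hardline, delta_factual)[0, 1]
print(f"correlation of partisan vs. misinformation amplification: {r:.2f}")
```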

Our work provides a better understanding of the relationship between what people see online and what they share. Identifying these hidden asymmetries highlights nuances overlooked in previous studies. A better understanding of social structures can help policy makers develop better policy and public health experts craft more effective messaging strategies.


