2 reasons fake COVID info is hard to fight

Study participants overwhelmingly thought that other people were more vulnerable to misinformation. This phenomenon, known as the "third-person effect," predicts that people perceive media messages as having a greater effect on others than on themselves. (Credit: Getty Images)

A new study highlights two reasons misinformation about COVID-19 is so difficult to tackle on social media.

First, most people consider themselves above average at spotting misinformation. Second, misinformation often triggers negative emotions that resonate with people.

The findings may help communicators share accurate information more effectively.

“This study gives us more insight into how users respond to misinformation about the pandemic on social media platforms,” says Yang Cheng, an assistant professor of communication at North Carolina State University and first author of the study in Online Information Review. “It also gives us information we can use to share accurate information more effectively.”

For the study, researchers conducted a survey of 1,793 US adults. The survey asked a range of questions designed to address four issues:

  • The extent to which study participants felt COVID misinformation online affected them and others;
  • The extent to which misinformation triggered negative emotions;
  • Their support for government restrictions on social media and misinformation;
  • Their support for media literacy training and other corrective actions.

One of the most powerful findings was that study participants overwhelmingly thought that other people were more vulnerable to misinformation. This phenomenon is known as the “third-person effect,” which predicts that people perceive media messages as having a greater effect on others than on themselves.

“This makes it harder to get people to participate in media literacy education or training efforts, because it suggests that most people think everyone else needs the training more than they do,” Cheng says.

The researchers also found that content containing misinformation was likely to evoke negative emotions such as fear, worry, and disgust. That’s troubling for two reasons.

“First, people are likely to act on content that evokes negative emotions, and that includes sharing information on social media,” Cheng says. “Second, messages that are focused on emotions are more easily transmitted on social media than content that is neutral—such as abstract scientific information.”

However, Cheng also notes that science communicators could make use of this information.

“Since fear, worry, or other negative emotions can facilitate information seeking, or encourage people to avoid specific behaviors during a crisis, communicators may want to consider using these emotional messages to convey accurate information about COVID-19 and public health.”

The researchers also found that the better individuals thought they were at detecting misinformation relative to everyone else, the more likely they were to support both government restrictions on misinformation and corrective actions, such as media literacy education. Participants who experienced negative emotions were also more likely to support government restrictions.

Additional coauthors are from South China University of Technology.

Source: NC State