There’s a powerful correlation between the extent to which users trust Facebook and the intensity of their Facebook use, according to a new study.
The study also identifies what contributes to that user trust.
“We looked at both trust and distrust, testing for them separately,” says first author Yang Cheng, an assistant professor of communication at North Carolina State University.
Broadly speaking, trust is when you expect a person or entity to behave in a positive way, whereas distrust is when you expect a person or entity to behave in a negative way. But in the context of this study, it’s also fair to think of trust as being more cognitive in nature (the way you think about an entity), whereas distrust is more intuitive (or the way you feel about an entity).
To begin addressing issues of trust and social media use, the researchers conducted a survey of 661 social media users in the United States. Survey questions addressed a variety of issues, including:
- The extent to which study participants trust Facebook;
- The extent to which they distrust Facebook;
- Information trustworthiness, or the extent to which they think items posted on Facebook are true;
- Information elaboration, or the extent to which they think about the consequences of misinformation on Facebook;
- Self-efficacy, or how good participants think they are at avoiding misinformation;
- Prescriptive expectancy, or the extent to which they think Facebook should be proactive about addressing misinformation; and
- Intensity of Facebook use, or the extent to which they use and rely on Facebook.
The researchers found that trust was very strongly correlated with the intensity of Facebook use. Distrust, however, was not.
“This is an important lesson for communicators: you need to cultivate trust,” Cheng says.
But what builds trust?
The characteristic most strongly correlated with trust was self-efficacy.
“In other words, the better you think you are at sorting misinformation from accurate information, the more likely you are to trust Facebook,” Cheng says. “And the more you trust Facebook, the more likely you are to be a high-intensity Facebook user. Unfortunately, thinking that you are better than other people at identifying misinformation does not mean you are actually better than other people at identifying misinformation.”
The other variable that was positively correlated with trust in Facebook was information trustworthiness, or the extent to which people thought posts on Facebook were true.
“While our work highlights the importance of building trust, it also highlights the challenge this poses for a company like Facebook,” Cheng says. “Facebook can promote media literacy, but actual media literacy is not necessarily related to self-efficacy. And Facebook has not shown that it can ensure the posts on its platform are true.
“If people don’t trust Facebook, they’re less likely to spend as much time there, or to engage as fully with content on the site. And it remains unclear how much control Facebook has over the variables that contribute to trust in the platform.”
The other variables the researchers examined were both negatively correlated with trust in Facebook. In other words, the more people thought about the consequences of misinformation shared online, the less they trusted Facebook. And the more people thought Facebook should proactively work to limit misinformation, the less they trusted Facebook.
Again, neither of those variables is inherently within Facebook’s control. However, one could hypothesize that increased efforts from Facebook to reduce misinformation on its platform could weaken the negative correlation between those variables and trust in Facebook.
The paper appears in Online Information Review.
Source: NC State