Political bias on Twitter comes from users, not the platform

"We hope this study increases awareness among social media users about the implicit biases of their online connections and their vulnerabilities to being exposed to selective information, or worse, such as influence campaigns, manipulation, misinformation, and polarization," says Diogo Pacheco. (Credit: Getty Images)

Political bias on Twitter emerges from users, not the platform itself, according to a new study.

In this era of political polarization, many accuse social media platforms such as Twitter of liberal bias, claiming that they intentionally favor and amplify liberal content and users while suppressing other political content.

But the new study finds this is not the case. Political biases, the researchers found, stem from users' own social interactions: we receive content closely aligned with whatever our online friends produce, especially our very first online friends. The researchers also found that, on balance, the political biases on Twitter favor conservative content.

“Our main finding is that the information Twitter users see in their news feed depends on the political leaning of their earliest connections,” says coauthor Filippo Menczer of Indiana University. “We found no evidence of intentional interference by the platform. Instead, bias can be explained by the use, and abuse, of the platform by its users.”

To uncover biases in the online news and information to which people are exposed on Twitter, the researchers deployed 15 bots, called “drifters” to distinguish their neutral behavior from that of other social bots on Twitter. The drifters mimicked human users but were controlled by algorithms that activated them at random to perform actions.

After initializing each bot with one first friend from a popular news source aligned with the left, center-left, center, center-right, or right of the US political spectrum, the researchers let the drifters loose “in the wild” on Twitter.
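The paper does not reproduce the bots' code, but the behavior it describes, seeding each account with a single news-source friend and then performing randomized actions on an irregular schedule, can be sketched roughly as follows. This is a minimal illustration, not the study's implementation: the Tweepy v3-style calls, action mix, probabilities, and placeholder handles are all assumptions.

```python
import random
import time

import tweepy  # real library; exact usage here is an illustrative assumption

# Placeholder seed handles; the study used popular news sources spanning
# left, center-left, center, center-right, and right.
SEED_FRIENDS = {
    "left": "seed_left_outlet",
    "center": "seed_center_outlet",
    "right": "seed_right_outlet",
}

def init_drifter(api: tweepy.API, seed_handle: str) -> None:
    """Give the fresh drifter account exactly one first friend."""
    api.create_friendship(screen_name=seed_handle)

def drift_step(api: tweepy.API) -> None:
    """Perform one randomly chosen action, mimicking a casual user.
    The action mix and probabilities are illustrative assumptions."""
    timeline = api.home_timeline(count=20)
    if not timeline:
        return
    tweet = random.choice(timeline)
    roll = random.random()
    if roll < 0.4:
        api.retweet(tweet.id)                         # reshare something seen
    elif roll < 0.7:
        api.create_favorite(tweet.id)                 # like it
    else:
        api.create_friendship(user_id=tweet.user.id)  # follow its author

def run_drifter(api: tweepy.API, seed_handle: str, steps: int = 150) -> None:
    """Initialize, then act at random intervals (~5 months of daily-ish steps)."""
    init_drifter(api, seed_handle)
    for _ in range(steps):
        drift_step(api)
        time.sleep(random.uniform(3600, 86400))  # wake up at a random time
```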

The researchers collected data on the drifters daily. After five months, they examined the content consumed and generated by the drifters, analyzing the political alignment of the bots’ friends and followers and their exposure to information from low-credibility news and information sources.
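The measurement side of such an analysis, scoring a drifter's network by the political alignment of the sources its friends share and by the fraction of links from low-credibility domains, might look something like the sketch below. The alignment scores and domain list are hypothetical stand-ins for the published media-bias ratings and low-credibility source lists that studies of this kind rely on.

```python
from urllib.parse import urlparse

# Hypothetical lookup tables standing in for published ratings.
ALIGNMENT = {
    "leftnews.example": -1.0,
    "centernews.example": 0.0,
    "rightnews.example": 1.0,
}
LOW_CREDIBILITY = {"junknews.example"}

def domain(url: str) -> str:
    """Normalize a URL to its bare domain (Python 3.9+ for removeprefix)."""
    return urlparse(url).netloc.lower().removeprefix("www.")

def score_account(shared_urls: list[str]) -> float | None:
    """Mean alignment of the rated sources an account shares, or None."""
    scores = [ALIGNMENT[d] for d in map(domain, shared_urls) if d in ALIGNMENT]
    return sum(scores) / len(scores) if scores else None

def drifter_metrics(friends_shared_urls: list[list[str]],
                    exposed_urls: list[str]) -> tuple[float | None, float]:
    """friends_shared_urls: one list of shared URLs per friend.
    exposed_urls: every link that appeared in the drifter's feed."""
    friend_scores = [s for s in map(score_account, friends_shared_urls)
                     if s is not None]
    mean_alignment = (sum(friend_scores) / len(friend_scores)
                      if friend_scores else None)
    low_cred_share = (sum(domain(u) in LOW_CREDIBILITY for u in exposed_urls)
                      / len(exposed_urls) if exposed_urls else 0.0)
    return mean_alignment, low_cred_share
```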

The research revealed that the political alignment of an initial friend on social media has a major impact on the structure of a user’s social network and their exposure to low-credibility sources.

“Early choices about which sources to follow impact the experiences of social media users,” Menczer says.

The study found that drifters tended to be drawn to the political right. Drifters with right-wing initial friends were gradually embedded into homogeneous networks where they were exposed to more right-leaning and low-credibility content. They even started to spread right-leaning content themselves. They also tended to follow more automated accounts.

Because the drifters were designed to be neutral, the partisan nature of the content they consumed and produced reflects biases in the “online information ecosystem” created by user interactions, according to Menczer.

“Online influence is affected by the echo-chamber characteristics of the social network,” he says. “Drifters following more partisan news sources received more politically aligned followers, becoming embedded in denser echo chambers.”

To avoid getting stuck in online echo chambers, users must make extra efforts to moderate the content they consume and the social ties they form, according to coauthor Diogo Pacheco, a former postdoctoral fellow at the Center for Complex Networks and Systems Research at Indiana University-Bloomington who is now a lecturer in computer science at the University of Exeter.

“We hope this study increases awareness among social media users about the implicit biases of their online connections and their vulnerabilities to being exposed to selective information, or worse, such as influence campaigns, manipulation, misinformation, and polarization,” says Pacheco. “How to design mechanisms capable of mitigating biases in online information ecosystems is a key question that remains open for debate.”

The study appears in the journal Nature Communications. The authors are a team of researchers from the Observatory on Social Media (OSoMe, pronounced “awesome”) at Indiana University-Bloomington, led by Menczer, who directs OSoMe and is a professor of informatics and computer science at the Luddy School of Informatics, Computing, and Engineering. OSoMe receives support in part from the Office of the Vice Provost for Research.

Source: Indiana University