Twitter Research Shows Its Algorithm Favors Conservative Opinions

Highlights

A Twitter blog post reveals that the Twitter algorithm promotes content that leans to the right more often than to the left, but the reasons for this remain unclear. The findings were based on an internal study on the algorithmic amplification of political content on Twitter.

During the study, Twitter analyzed millions of tweets posted between April 1 and August 15, 2020. These tweets came from media outlets and elected officials in Canada, France, Germany, Japan, Spain, the United Kingdom, and the United States. In every country studied except Germany, Twitter found that tweets from right-wing accounts "receive more algorithmic amplification than the political left." It also found that content from right-leaning news outlets benefits from the same bias.
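
Loosely speaking, "algorithmic amplification" means how much more reach a group's tweets get in the personalized timeline than they would in a plain chronological feed. The sketch below is a hypothetical illustration of that ratio, not Twitter's actual methodology; the TweetReach structure, the amplification_ratio helper, and all numbers are invented for the example, and the comparison against a chronological baseline is an assumption about the study's setup rather than something the article spells out.

```python
# Hypothetical sketch: compare how often a political group's tweets are shown
# in the personalized (algorithmic) timeline versus a chronological baseline.
from dataclasses import dataclass

@dataclass
class TweetReach:
    author_group: str          # e.g. "left-leaning" or "right-leaning"
    algo_impressions: int      # impressions in the personalized timeline
    chrono_impressions: int    # impressions in the chronological baseline

def amplification_ratio(tweets: list, group: str) -> float:
    """Ratio of algorithmic to chronological reach for one political group."""
    subset = [t for t in tweets if t.author_group == group]
    algo = sum(t.algo_impressions for t in subset)
    chrono = sum(t.chrono_impressions for t in subset)
    return algo / chrono if chrono else float("nan")

# Made-up numbers: a ratio above 1.0 means the algorithmic timeline surfaced
# the group's tweets more often than the chronological one would have.
sample = [
    TweetReach("right-leaning", algo_impressions=1200, chrono_impressions=800),
    TweetReach("left-leaning", algo_impressions=950, chrono_impressions=780),
]
print(amplification_ratio(sample, "right-leaning"))  # 1.5
print(amplification_ratio(sample, "left-leaning"))   # ~1.22
```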

Twitter says it does not know why the data suggests that its algorithm favours right-leaning content, noting that this is "a significantly more difficult question to answer as it is a product of the interactions between people and the platform." However, the issue may not lie with Twitter's algorithm specifically: Steve Rathje, a PhD candidate who studies social media, has published research explaining how divisive content about political out-groups is more likely to go viral.

The Verge reached out to Rathje to get his thoughts on the Twitter findings. "In our study, we also were interested in what kind of content is amplified on social media and found a consistent trend: negative posts about political outgroups tend to receive much more engagement on Facebook and Twitter," Rathje stated. "In other words, if a Democrat is negative about a Republican (or vice versa), this kind of content will usually receive more engagement."
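
As a toy illustration of the kind of comparison Rathje describes (not his team's actual analysis), one could label posts by whether they are negative about a political out-group and compare average engagement; the labels and figures below are invented for demonstration.

```python
# Illustrative only: compare mean engagement for posts that are negative about
# a political out-group versus everything else.
from statistics import mean

posts = [
    {"outgroup_negative": True,  "engagement": 540},
    {"outgroup_negative": True,  "engagement": 610},
    {"outgroup_negative": False, "engagement": 220},
    {"outgroup_negative": False, "engagement": 190},
]

negative = mean(p["engagement"] for p in posts if p["outgroup_negative"])
other = mean(p["engagement"] for p in posts if not p["outgroup_negative"])
print(f"out-group negative: {negative:.0f}, other: {other:.0f}")
```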

If we take Rathje's research into account, this could mean that right-leaning posts on Twitter are more successful at eliciting outrage, resulting in their amplification. Perhaps the problem with Twitter's algorithm is the promotion of toxic tweets rather than a specific political bias. And as mentioned earlier, Twitter's study found that Germany was the only country where right-leaning content did not see this amplification. That could be related to Germany's agreement with Facebook, Twitter and Google to remove hate speech within 24 hours. Some users even change their country setting to Germany on Twitter to keep Nazi content from appearing on the platform.

Frances Haugen, the whistleblower who leaked several internal Facebook documents, claims that Facebook's algorithm favours hate speech and divisive content. Twitter could easily be in the same position, but it is openly sharing some of its internal findings before they have a chance to leak.

Rathje pointed to another study which found that moral outrage amplified viral posts from both liberal and conservative points of view, but was more successful when it came from conservatives. He says that when it comes to features like algorithmic promotion that lead to virality on social media, "more research needs to be done to examine whether these features help explain the amplification of right-wing content on Twitter." If the platform digs deeper into the problem and opens access to outside researchers, it could better handle the divisive content at the heart of this issue.
