If there’s any single talking point conservative politicians have agreed upon in the last few years, it’s that social media is unfairly targeting them for their opinions. A deep study of political speech on Twitter (h/t TechDirt), carried out by researchers from MIT and Yale, suggests that being conservative isn’t the problem; instead, it’s conservatives’ penchant for sharing misinformation that creates the impression of bias.
The researchers — Mohsen Mosleh, David Rand, Qi Yang, and Tauhid Zaman — identified 9,000 “politically engaged” Twitter accounts, half Republican and half Democratic, and observed their behavior during the six months following the 2020 presidential election. They did find that far more of the Republican accounts were banned than the Democratic ones: about 35 percent of the conservative accounts were suspended in those six months, compared with only about 8 percent of the left-leaning accounts.
It would be easy to interpret these results as evidence of genuine political bias on Twitter’s part, but the researchers dug deeper and found the problem wasn’t simply being conservative; it was that conservatives were sharing far more misinformation than their liberal counterparts.
All about the lies —
This is by no means the first time research has shown that Twitter does not make moderation decisions based on a user’s political beliefs. In fact, Twitter itself has found that its site amplifies right-leaning content (though the company knows little about why that’s the case). What makes the study unique is both its relatively large sample size and its focus on what content, exactly, users are being banned for.
“The observed asymmetry could be explained entirely by the tendency of Republicans to share more misinformation,” the research paper reads. “While support for action against misinformation is bipartisan, the sharing of misinformation — at least at this historical moment — is heavily asymmetric across parties.”
While the definition of “misinformation” is hotly debated, the researchers relied on previous studies and expert opinions to inform how they distinguished between truth and lies.
But everyone agrees on this —
Before they’d even begun watching these Twitter accounts, the research team surveyed 4,900 people about whether they believed social media platforms were taking enough action against the spread of misinformation.
And here’s the thing: both Democratic and Republican respondents largely agreed that these platforms should take action to reduce misinformation. We’re talking a “yes” rate of 67.2 percent from Republican participants here.
The high rate of bans can likely be attributed to the period the researchers chose: the fallout of the 2020 election was intense across social media. The conclusions drawn from that time should still apply, though, just on a smaller scale.