There are some very valuable insights in a paper recently published in Nature - “The political effects of X’s feed algorithm”.
The researchers carried out an experiment in which users were exposed for seven weeks either to the standard algorithmic feed on the X social network or to a chronological one. Their political views were measured over time, alongside their level of engagement with the platform. Both changed for users exposed to the algorithmic feed.
Switching from a chronological to an algorithmic feed increased engagement and shifted political opinion towards more conservative positions, particularly regarding policy priorities, perceptions of criminal investigations into Donald Trump and views on the war in Ukraine. In contrast, switching from the algorithmic to the chronological feed had no comparable effects.
In numbers:
Among these participants, those assigned to switch to the algorithmic feed were 5.2 percentage points less likely to reduce their X usage than those who remained on the chronological feed (95% confidence interval (CI): 0.7, 9.7; P = 0.024).
They were 4.7 percentage points more likely to prioritize policy issues considered important by Republicans, such as inflation, immigration and crime (95% CI: 0.7, 8.7; P = 0.023).
They were also 5.5 percentage points more likely to believe that the investigations into Trump are unacceptable, describing them as contrary to the rule of law, undermining democracy, an attempt to stop the campaign and an attack on people like themselves (95% CI: 0.8, 10.2; P = 0.022).
They were 7.4 percentage points less likely to hold a positive view of Ukrainian President Volodymyr Zelensky (95% CI: 1.8, 13.0; P = 0.009).
Finally, they were 3.7 and 2.3 percentage points more likely to follow any conservative account (95% CI: 0.5, 7.0; P = 0.025) and any political activist account (95% CI: −0.1, 4.8; P = 0.061) on X, respectively.
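As a sanity check on figures like these, a 95% confidence interval and a two-sided p-value carry roughly the same information under a normal approximation: the interval excludes zero exactly when P < 0.05. A minimal sketch, assuming a normal sampling distribution (the paper's exact estimation method may differ; `p_from_ci` is an illustrative helper, not from the paper):

```python
from math import sqrt, erf

def p_from_ci(estimate, lo, hi, level_z=1.96):
    """Recover an approximate two-sided p-value from a point estimate
    and its 95% CI, assuming a normal sampling distribution."""
    se = (hi - lo) / (2 * level_z)   # back out the standard error from the CI width
    z = estimate / se                # standardized effect size
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Example: the engagement effect, 5.2 percentage points with 95% CI (0.7, 9.7)
print(round(p_from_ci(5.2, 0.7, 9.7), 3))  # close to the reported P = 0.024
```

Running this on the other reported effects reproduces their p-values to similar precision, which is a quick way to verify that a quoted interval and p-value are mutually consistent.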
Some reasons why:
First, we confirmed that the algorithmic feed is more engaging.
…
Second, the algorithm promotes political content and, within that category, prioritizes conservative content.
…
Third, the algorithm demotes accounts of traditional news media and promotes those of political activists.
Turning off the algorithmic feed for those originally exposed to it did not undo the effect. Why?
We found that the algorithm promotes conservative content and demotes posts by traditional media. Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm, helping explain the asymmetry in effects.
Which leads to the conclusion that:
….initial exposure to X’s algorithm has persistent effects on users’ current political attitudes and account-following behaviour, even in the absence of a detectable effect on partisanship.
Once again, the supposed fears of part of the alt right that social media companies are “brainwashing” everyone with “dangerous” left-wing liberal views prove to be the exact opposite of the truth. Users of at least the X algorithm are having their viewpoints shifted towards those of the current, extremely right-wing US administration.
This should be a wake-up call for politically engaged funders and anyone who cares about civil society. It’s not that we need less conservative algorithms; it’s that whoever controls the algorithms has a disproportionate say over the electorate’s view of the world.