The digitized public in the grip of social media

by time news

2023-08-21 12:43:20

According to a frequent criticism, the way social media platforms work contributes to the escalation and polarization of political conflicts. In particular, the algorithms that determine which messages are shown to which users are held responsible. By trying to detect and serve users’ preferences, they are suspected of reinforcing existing opinions and attitudes on the one hand and, above all, of promoting contact with like-minded people on the other. This algorithmic amplification makes certain content more visible in users’ feeds. It interacts with social amplification by the users themselves, who give particular content more visibility by sharing and reposting it.

Hardly anyone knows how this works in detail, because the algorithms are trade secrets. The extent and consequences of the resulting “filter bubbles” and “echo chambers” therefore remain controversial, and until now they have been difficult to prove without direct access to usage data.

A collaboration between Meta, the parent company of Facebook and Instagram, and an international team of researchers now makes a systematic assessment and targeted experiments possible for the first time. The “Facebook and Instagram Election Study” draws on usage data from 2020 and 2021 and thus covers, among other things, the hot phase of the last presidential election in the USA. The first studies resulting from this cooperation, published in Nature and Science, examine some of the suspected effects on the political opinion formation of 208 million American Facebook users and provide insights into the engine room of digitized publics.

Promising measures proved ineffective

In various experiments, the overview pages (“feeds”) of some users were modified so that the consequences of targeted interventions could be observed. Three measures that were often considered promising turned out to be largely ineffective. In a first experiment, posts shared by other users were hidden. As a result, users saw less political news and fewer posts from untrustworthy sources, but they also engaged less. In another study, posts from “like-minded” contacts with similar political views were deliberately reduced by a third, which made exposure to other opinions more likely. In a third experiment, the algorithm was effectively switched off and users saw all posts from their contacts in reverse chronological order. However, this displayed more content from sources that have repeatedly been flagged for misinformation but are usually filtered out by the algorithm.

In none of these experiments did the political attitudes of the affected users differ significantly from those of the respective control groups. The participants did, however, spend less time on the platforms than other users, evidently because the platforms had become less attractive without algorithmic or social amplification.

Conservative niche with no equivalent

But that by no means implies that participation in the digital public sphere is equally inconsequential for everyone. Another study analyzed the trajectories of political news, from availability through dissemination to response, and found clear ideological segregation: conservatives close to the Republican Party have a news niche of their own for which there is no equivalent among liberals closer to the Democratic Party. News consumption on the conservative side of the political spectrum is therefore much more homogeneous. If the criticism from conservative circles that the classic “mainstream media” have a liberal bias were correct, the online world would be the desired counterweight: on Facebook, at least, conservative news sources dominate.

The finding that algorithmic and social amplification has hardly any measurable political effect is likely to please the Meta group and to serve as an argument against further regulation. Strictly speaking, however, the result simply means that the influence of social media on political communication cannot just be switched on or off. Without algorithms and social influence, the platforms become less interesting, but not necessarily better.

Science 381(6656), 392–408, https://www.science.org/toc/science/381/6656; Nature 620(7972), 137–144.
