Research shows that hostility towards opposing groups drives engagement on social media

When browsing social media sites such as Twitter and Facebook, you may have been annoyed by the number of posts about people and political parties whose political views differ from your own. A new study that analyzed more than 2.7 million social media posts found that posts about opposing political groups are more likely to be shared.
Out-group animosity drives engagement on social media | PNAS

Hostility towards outsiders motivates engagement on social media
https://www.psypost.org/hostility-towards-outsiders-motivates-engagement-on-social-media/
On social media platforms such as X and Facebook, a wide range of users share content and communicate in real time. In modern society, these platforms have become major sources of information alongside newspapers and television news programs.
In newspapers and news broadcasts, every reader or viewer sees the same content, but on social media, algorithms select what each user sees and recommend content suited to their personal tastes and preferences, so no two users ever see exactly the same feed.
These algorithms aim to recommend content that users are likely to enjoy and to keep them on the platform for as long as possible. Recommending only content that matches a user's preferences may be harmless for music and movies, but on political topics there are concerns that it can create an echo chamber, in which people encounter only others who share their opinions, worsening political polarization and leading to social division.
In the new study, a research team led by Steve Rathje, who studies misinformation and political polarization on social media at the University of Cambridge in the UK, investigated whether hostility toward politically opposing groups increases engagement on social media. The team hypothesized that 'in a polarized society, expressing hostility toward opposing groups may be more effective at demonstrating partisan identity than praising one's own group.'

In the first analysis, the research team examined social media posts from the accounts of American liberal and conservative news media outlets.
The analysis found that each additional negative sentiment word in a post was associated with a 5-8% increase in shares and reposts. For conservative news media on Facebook, however, posts containing negative sentiment words were associated with roughly 2% fewer shares and reposts. Positive sentiment words, meanwhile, were associated with a 2-11% decrease in shares or reposts overall.
Even more striking, each additional word referring to a politically opposing group made a post 35-57% more likely to be shared or reposted. Across both liberal and conservative news media, these posts were reported to provoke reactions such as 'anger' and 'ridicule' from users.
The second analysis focused on the Facebook and X accounts of Democratic and Republican members of the US Congress, covering 825,424 Facebook posts and 1,078,562 X posts. The results again showed that each additional negative sentiment word increased the likelihood of a post being shared or reposted by 12-45%, and each additional word referring to a politically opposing group increased that likelihood by 65-180%. Posts that referenced opposing groups also elicited stronger 'anger' responses from users.
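To make the per-word percentages above concrete, here is a minimal sketch of how this kind of analysis can be set up: dictionary words are counted for each post, and share counts are regressed on those counts with a negative binomial model, whose exponentiated coefficients can be read as the percentage change in expected shares per additional word. The word lists, toy data, and column names below are illustrative assumptions, not the study's actual dictionaries or code.

```python
# A minimal, hypothetical sketch of the kind of analysis described above;
# the word lists and toy data are invented, and the negative binomial
# regression is an illustration, not the authors' actual code.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical word lists standing in for the study's dictionaries.
OUTGROUP_WORDS = {"democrat", "democrats", "republican", "republicans"}
NEGATIVE_WORDS = {"bad", "terrible", "corrupt", "disgrace", "angry"}

def count_words(text, vocab):
    """Count how many tokens of a post appear in a given word list."""
    return sum(token.strip(".,!?").lower() in vocab for token in text.split())

# Tiny toy dataset; a real analysis would load millions of posts.
posts = pd.DataFrame({
    "text": [
        "Republicans pass a terrible, corrupt bill",
        "Proud of our team today",
        "Democrats are a disgrace again",
        "Lovely weather at the rally",
        "Republicans block the vote",
        "Great news for our community",
    ],
    "shares": [120, 15, 200, 8, 60, 20],
})
posts["outgroup_count"] = posts["text"].apply(count_words, vocab=OUTGROUP_WORDS)
posts["negative_count"] = posts["text"].apply(count_words, vocab=NEGATIVE_WORDS)

# Negative binomial regression of share counts on per-post word counts.
X = sm.add_constant(posts[["outgroup_count", "negative_count"]])
result = sm.GLM(posts["shares"], X, family=sm.families.NegativeBinomial()).fit()

# exp(coefficient) - 1 is the estimated change in expected shares per
# additional word, e.g. exp(b) = 1.65 would correspond to a 65% increase.
print(((np.exp(result.params) - 1) * 100).round(1))
```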

While the findings of this study shed light on what is posted on social media and how users respond to it, it is important to note that platform algorithms also play a major role in shaping those responses: users decide whether or not to interact with the content they see, but it is algorithms that determine which content appears in their feeds and timelines in the first place.
The research team pointed out that Facebook's 2018 algorithm change, which placed emphasis on engagement such as sharing and commenting on posts, may have created an environment in which posts about politically opposing groups were more likely to spread. 'Ironically, posts about politically opposing groups were more effective at generating comments and reactions, with angry reactions being the most popular in our study. In other words, the algorithm change, made in the name of bringing people closer together, may have helped prioritize posts that contain antipathy toward out-groups,' the research team argued.
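To illustrate the mechanism the researchers describe, the hypothetical sketch below ranks a toy feed by an engagement score that weights comments, shares, and reactions more heavily than likes. The weights and posts are invented for illustration and do not reflect Facebook's actual ranking formula.

```python
# Hypothetical engagement-weighted ranking; weights and posts are invented
# and are not Facebook's actual formula.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    comments: int
    shares: int
    angry_reactions: int
    likes: int

def engagement_score(p: Post) -> float:
    # Weight "active" engagement (comments, shares, reactions) above passive likes.
    return 3.0 * (p.comments + p.shares) + 2.0 * p.angry_reactions + 1.0 * p.likes

feed = [
    Post("The other party is ruining the country", comments=40, shares=30, angry_reactions=50, likes=20),
    Post("Our party passed a great bill today", comments=10, shares=12, angry_reactions=1, likes=80),
]

# Out-group posts tend to draw more comments and angry reactions, so an
# engagement-weighted feed tends to rank them above in-group praise.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(round(engagement_score(post)), post.text)
```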