“it’s often difficult to understand which comes first: a polarized situation or the social media that aggravates that situation. Rather, it’s become a self-reinforcing system.”
…
“Animosity toward members of opposing parties is very high even though our divisions over policy preferences don’t appear to have grown, according to new research published in Science magazine. The paper, written by scholars from six disciplines, brings together a number of different studies on the topic and finds that, these days, we’re more likely to hate the opposing side and consider them “different,” “dislikable,” and “immoral.” The result is a “political sectarianism” in which one’s party identity seems to come first, before policy, religion, or common ground. Political identity, in turn, shapes our other views instead of the other way around. For example, after seeing a clip of Donald Trump espousing a liberal policy, his supporters exhibited more liberal attitudes; the paper presumes Democrats would do the same for their political leaders.”
…
“The results of this kind of alignment are disastrous for a functioning democracy. As the researchers argue, “holding opposing partisans in contempt on the basis of their identity alone precludes innovative cross-party solutions and mutually beneficial compromises.””
…
“Then there’s distrust, encouraged by the president, of facts and of the journalism organizations that are necessary to protect democracy. A series of Pew Research Center polls shows that Republicans rely on and trust fewer news sites for politics than they used to, with Fox News, Trump’s mouthpiece and a fount of disinformation, being one of the few sources they regularly read and believe. However, research by Andy Guess, assistant professor of politics and public affairs at Princeton University, looks at web traffic rather than people’s survey responses and reveals considerable, consistent overlap in media consumption between the parties, except among a smaller set of extremists. This suggests many people might be reading the same sources but coming to totally different conclusions. Wildly divergent interpretations of the same news are a more difficult problem to fix.”
…
“Hyperpartisanship, tense societal factors, and divergent news diets (or at least divergent interpretations of the news) are then fed back through social media, which is likely amplifying our divisions. We don’t know exactly how the algorithms that select what information we see actually work, because each is a black box controlled by the social media company that built it.
What we do know is that Facebook has put less of an emphasis on news and more on engagement, and that posts with strong, emotional language get more engagement. We also know Facebook has continually promoted Groups since 2016, and those can function as echo chambers even without algorithmic help. YouTube, whose recommendation algorithms, like those of other platforms, were designed to make people spend more time on the site, has been shown to radicalize people through inflammatory messaging. Most recently, it has been awash in election misinformation.”
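To make that engagement dynamic concrete, here is a minimal, purely illustrative Python sketch. It is not any platform’s actual ranking code (which, as the passage notes, is a black box); the Post fields, the signal weights, and the emotional_intensity score are all invented assumptions for the example.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    emotional_intensity: float  # 0.0-1.0; assume an upstream sentiment model supplies this

def engagement_score(post: Post) -> float:
    # Comments and shares are treated as stronger engagement signals than
    # likes, and emotionally charged language multiplies the score upward.
    raw = post.likes + 3 * post.comments + 5 * post.shares
    return raw * (1.0 + post.emotional_intensity)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first: measured posts sink, strident ones rise.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy explainer", likes=120, comments=10, shares=5, emotional_intensity=0.1),
    Post("Outraged partisan broadside", likes=80, comments=60, shares=40, emotional_intensity=0.9),
])
print([p.text for p in feed])  # ['Outraged partisan broadside', 'Measured policy explainer']

The point of the toy model is only this: when reactions are the optimization target, a strident post with fewer likes can still outrank a calmer, more informative one.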
…
“The share of Americans who often get their news from social media grew 10 percentage points to 28 percent last year, according to Pew. Those who mainly get their news that way were also less informed about current events and more likely to have been exposed to conspiracy theories.”
…
“A new study from the University of Virginia found that increased Facebook usage among conservatives is associated with their reading more conservative sites than they normally would. The effect was less dramatic among liberals.
The study’s authors conjectured that the way Facebook works might have something to do with this outcome. In addition to algorithms favoring engagement, the very structure of Facebook limits who we talk to: You have to “friend” others to see their posts, meaning you’re less likely to see posts from people outside your real-life friends and family, who are more likely to have similar lives and viewpoints. Facebook also tweaked its algorithms after the 2016 election to promote posts from friends and family and to show far fewer posts from news outlets, which likely further contributed to filter bubbles and division.”
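A rough sketch of that structural filter, again in Python and again hypothetical: the friend-graph restriction and the post-2016 friends-and-family boost are described in the passage above, but the multipliers and field names here are invented for illustration.

# The friend-graph restriction and the friends-and-family boost come from
# the passage above; FRIEND_BOOST and PUBLISHER_PENALTY are assumed values.

FRIEND_BOOST = 2.0       # assumed boost for posts from friends and family
PUBLISHER_PENALTY = 0.3  # assumed demotion for posts from news pages

def visible_candidates(friends: set[str], posts: list[dict]) -> list[dict]:
    # Structural filter: you only see posts from accounts you have friended
    # or pages you follow, so the candidate pool already skews toward people
    # whose lives and viewpoints resemble yours.
    return [p for p in posts if p["author"] in friends or p["source"] == "page"]

def adjusted_score(post: dict) -> float:
    base = float(post["engagement"])
    if post["source"] == "friend":
        return base * FRIEND_BOOST
    return base * PUBLISHER_PENALTY  # news outlets shown far less after 2016

friends = {"alice", "bob"}
posts = [
    {"author": "alice", "source": "friend", "engagement": 50},
    {"author": "wire_service", "source": "page", "engagement": 200},
]
ranked = sorted(visible_candidates(friends, posts), key=adjusted_score, reverse=True)
print([p["author"] for p in ranked])  # ['alice', 'wire_service']: the friend's post wins

Even a news-outlet post with four times the raw engagement loses to a friend’s post once the friend boost and publisher demotion are applied, which is the filter-bubble mechanic the study’s authors conjecture.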
…
“Research highlighted in the Wall Street Journal suggests that people on social media do see opposing viewpoints. But since sites like Facebook are calibrated to highlight posts that elicit reactions, we’re seeing the most acerbic of opposing views, which can lead people to be even more repelled by them. The result is even more entrenched viewpoints and more polarization.”
…
““It’s not just a matter of coming into contact with the other side,” Eli Pariser told Recode about how his conception of filter bubbles has changed since he first coined the term. “It’s doing so in a way that leads us to greater understanding.””
https://www.vox.com/recode/21534345/polarization-election-social-media-filter-bubble