Why Facebook Makes Americans Feel Politically Polarized

By Equating Political and Social Identities, the Platform Makes Our Divides Feel Unbridgeable

WhatsApp and Facebook app icons on an iPhone. Courtesy of Karly Domb Sadof/Associated Press.

It’s not hard to understand why moderate opinions are absent from the ranks of cable television talking heads, satellite radio hosts, and newspaper opinion columnists. Moderation doesn’t sell. Americans have short attention spans, and conflict is more interesting than cooperation. Inciting anger means increasing profits.

But why aren’t moderate viewpoints more prominent among the forms of mass communication to which the largely moderate public contributes—namely social media sites like Facebook? And is there anything that can be done to make such platforms more inviting places for the exchange of less extreme opinions?

The absence of moderate voices, an empirical reality confirmed by research, reflects a much larger problem: the politically polarizing effects of using social media. These sites have radically changed how people communicate with each other about politics. In particular, Facebook interlaces political content into a broader web of information about the lives and values of users.

This radical change to the way that people express their political identities, access information, and communicate with each other fosters the development of increasingly negative feelings toward people who hold different political opinions. Scrolling through the Facebook news feed triggers a cascade of processes that result in negative attitudes about those who disagree with us politically.

Inherent features of Facebook, paired with the norms of how people use the site, heighten awareness of political identity. My research shows that a multitude of nonpolitical information—such as where we eat, where we shop, and our favorite music—can send signals about our political views. Once we’ve recognized someone as a member of our out-group—a group outside our own—we make biased inferences about their political views. Facebook users judge other users with whom they disagree to be less politically knowledgeable and to use less reliable news sources.

Altering who participates in the online public sphere is not just a question of civility for civility’s sake. The absence of moderate voices contributes to the distorted view that most Americans have of public opinion. Americans believe they are more polarized and divided than they actually are, and that misperception may lead them to conclude that no compromise can be found on our toughest policy questions.

How can this situation be remedied? Both the public and the companies running these platforms have roles to play. The most successful solutions involve tweaking platforms to highlight the perspective of users who are part of the solution to polarization, and not part of the problem.

Of course, this answer oversimplifies things. Reasonable people might disagree about whether social media companies should alter their algorithms to favor particular kinds of communication, not just to maximize user engagement. Clandestine changes to the algorithm would likely cause a backlash over fears that a corporation was attempting to police the public sphere.

But Facebook might avoid this outcry if it were to introduce the feature change with great fanfare about wanting to make political communication more civil. It also could try to change the behavior of individual users, but this is harder than you might think.

The good news is that, while moderation is missing from our media, that’s not because moderates themselves have gone extinct. Despite the popular narrative about the extent of political discord in our country, social science research shows more consensus and temperance than you’d expect in what Americans actually believe about hot-button issues. Moderates do in fact exist in large numbers.

The trouble is that moderation tends to go hand in hand with lower levels of interest in politics. Many people identify as independents or as ideologically moderate because they don’t know enough about policy issues to form stronger opinions, let alone articulate them. Thus, on average, middle-of-the-roaders are both less interested in politics and less knowledgeable about it.

Moderate voices are not only quiet on Facebook; they appear to be silenced everywhere. Research suggests that many people find political discussions uncomfortable and worry about damaging their social relationships if they talk politics. Those with less confidence in the accuracy of their viewpoints are more sensitive to being wrong, and they may be hesitant to speak up because they don’t want to be critiqued publicly for their opinions. Add to this the vitriolic norms of social media, and it’s no surprise that, according to the Pew Research Center, majorities of users on social media sites find political interactions stressful and frustrating.

Thus, what won’t work is asking or incentivizing moderates to speak up more about their political opinions. As a group, they are less inclined to do so, and even if they did, it is not clear that their fellow users would recognize the subtleties of their opinions. One disturbing finding from my research is that the Facebook platform is well-suited to fostering the out-group homogeneity effect. That term refers to how the very act of identifying others as members of an out-group increases our propensity to think that they are all the same. In other words, people attribute too much extremism and consistency to the political views of their opponents, ascribing to them strong political identities and viewpoints that they do not actually hold.

So the solution to moderating the polarizing forces of social media starts with staying away from politics, at least on the surface.

Interestingly, Facebook users are usually accurate in inferring the political identities of other users based on even the nonpolitical content they post. This mapping of nonpolitical cues to political identities reinforces the idea of a large gulf between political parties.

But, in reality, not everyone who drives a Prius is a Democrat, nor do all pickup truck drivers identify as Republicans. So, to alter the polarizing effects of social media, we need to make the signals linking social characteristics and policy preferences noisier. Kale-eating conservatives and country music-loving liberals must speak up and share more about their lives in order to highlight the full range of diversity within each political party. Having partisans complicate their own stories could contribute to a reduction in the perception of social polarization between those with different political viewpoints.

Moderates have an equally important role to play, a role that does not force them out of their political comfort zones. Moderates often demonstrate tolerance for differing viewpoints in what they read, like, and comment upon. Social media companies need to alter their platforms in ways that reward and create incentives for these users who play nicely in the sandbox.

For example, Facebook could alter the news feed algorithm to lower relevancy scores for emotional political speech, ensuring that dispassionate communication is more likely to circulate on the site. Worried about fake news? Research suggests that those without a strong partisan attachment are more discerning about the quality of their news. Thus, the news that moderates flag as questionable may be more likely to be motivated by genuine concern about the content as opposed to partisan strategizing to antagonize the other side.
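To make the idea concrete, here is a minimal sketch, in Python, of what down-weighting emotional political speech in a ranking formula might look like. The function name, inputs, and penalty value are all my own illustrative assumptions, not anything Facebook actually runs; real feed ranking involves classifiers and signals far beyond this.

```python
# Hypothetical sketch: penalize emotionally charged political posts
# when computing a feed relevancy score. All names and numbers here
# are illustrative assumptions, not Facebook's actual system.

def adjusted_relevancy(base_score, is_political, emotion_score, penalty=0.5):
    """Return a relevancy score that down-weights emotional political posts.

    base_score    -- the platform's usual engagement-based score
    is_political  -- whether a (hypothetical) classifier tagged the post as political
    emotion_score -- estimated emotional intensity, in [0, 1]
    penalty       -- maximum fraction of the score that can be removed
    """
    if is_political:
        return base_score * (1 - penalty * emotion_score)
    return base_score

# A dispassionate political post keeps most of its score (about 9.5 of 10),
# while an angry one drops toward the bottom of the feed (about 5.5 of 10).
print(adjusted_relevancy(10.0, True, 0.1))
print(adjusted_relevancy(10.0, True, 0.9))
```

Under this kind of rule, dispassionate posts would outrank vitriolic ones even when the vitriol draws more raw engagement, which is the trade-off the paragraph above describes.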

Another solution stems from the finding that the people most inclined to “like” content posted by people in the opposing political party are frequent Facebook users who are the least partisan and politically interested. We could harness the judgment of these tolerant moderates by altering the news feed algorithm to assign a higher relevancy score to political content that these users like. This might be particularly useful because users with moderate opinions who do not frequently post political content are more likely to be situated in networks with a diversity of political opinions. These users could serve as a bridge to help partisans break out of their filter bubbles.
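A sketch of this second idea, under the same caveats: the inputs and weights below are invented for illustration. The assumption is that each "like" on a political post comes with a partisanship estimate for the liker, and that likes from low-partisanship users boost the post's score.

```python
# Hypothetical sketch: boost political posts endorsed by "bridging" users,
# i.e. frequent users with low partisanship scores. The partisanship
# estimates and the boost weight are illustrative assumptions.

def bridge_boost(base_score, liker_partisanship, boost=0.3):
    """Raise a post's score in proportion to likes from moderate users.

    liker_partisanship -- partisanship in [0, 1] for each user who liked
    the post (0 = moderate, 1 = strongly partisan).
    """
    if not liker_partisanship:
        return base_score
    moderate_weight = sum(1 - p for p in liker_partisanship)
    fraction_moderate = moderate_weight / len(liker_partisanship)
    return base_score * (1 + boost * fraction_moderate)

# Likes mostly from moderates lift the post (about 12.6 from a base of 10);
# likes from strong partisans barely move it.
print(bridge_boost(10.0, [0.1, 0.2, 0.1]))
print(bridge_boost(10.0, [0.9, 0.95]))
```

The design choice here mirrors the argument in the text: rather than asking moderates to post more, the algorithm quietly amplifies the judgment they already express through what they like.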

The problem of polarization on Facebook is multifaceted, and no single solution operating in isolation will remedy the root of the issue. While some of the causes of polarization on social media are best left to governments (such as eliminating foreign interference in elections) or to private companies (such as extirpating hate speech from the sites), we shouldn’t abdicate our own responsibility as citizens to contribute to fixing the problem.

