Tuesday, July 8, 2025

#189 / Thinking About Polarization

Political polarization is a problem. I think lots of people - most people, probably - would agree with that. So, what causes the "political polarization" problem? What can be done about it (if anything)?

Roland Fryer, writing in The Wall Street Journal, has come up with an analysis worth thinking about. Fryer is an economics professor, so he titles his opinion essay "The Economics of Polarization." To me, though, his insights, based on social science research, aren't really about "economics" at all. It looks to me like Fryer is describing "confirmation bias," more than anything else. 

In the column I have linked above, Fryer says the research he ultimately undertook, with a colleague, was stimulated by a continuing dispute with his wife. His wife likes to use her horn, proactively, and often, while driving. Fryer, himself, has long felt that this is not an effective way to prevent accidents, which is what his wife claims as the justification for her intensive use of the horn. 

Below, I present Fryer's description of how he and his colleague designed an experiment to test why even solid "evidence" doesn't seem to convince those who already "know" what is right. On his way to work, with his wife driving, Fryer had a dispute with her about her horn use habits. Fryer thought it was clear, from an incident occurring on the road, that his wife's use of the horn was inappropriate, and ineffective as an accident prevention technique:

As soon as she dropped me off on campus, I ran to my office to tell a fellow economist this anomaly I had observed. Was my wife irrational? Was I? Or did we need to think about inference and decision-making a bit differently? I confided first in Matthew Jackson, who specializes in social networks. He seemed as perplexed as I was and, because he knows my wife, offered up several interpretations that would make her seem more rational. Finally, he relented, and it became one of the guiding examples for us to think differently about how humans process information when there is uncertainty. 
In the simplest version of the model we developed, imagine that the truth is either A or B. Climate change either is or isn’t caused by human activity. The death penalty either deters crime or it doesn’t. No one really knows the truth—but we start with a prior belief about how plausible A and B seem. Each person observes a series of signals, information that suggests the truth might be A or B. Some signals are ambiguous and come as AB rather than A or B. 
If you were fully rational and able to set aside prior beliefs, you’d store the information in a sequence—A, B, AB, AB, A, AB, A, B, B—and add it up at the end: three points for A, three for B, and three ambiguous signals. 
But if you tend to align unclear evidence with your previous expectations, you would come away thinking your original instincts were right. If you construe all the “AB” signals as A (or B), you now think the evidence falls on your side by a 2-to-1 margin. Further observations of the world entrench that view rather than correcting it, because future ambiguous signals will have the same skew.... 
We explored the model’s implications in an online experiment with more than 600 subjects, modeled on a pioneering 1979 paper by Charles G. Lord and colleagues. First, participants were presented with questions about their beliefs on climate change and the death penalty. Then they read a series of summaries of research about each topic. After each summary, we asked participants if they thought the summary provided evidence for or against the topic on a 16-point scale. After all of the summaries were presented, we repeated the initial questions about their beliefs about the topic.
There was a very significant correlation between a subject’s prior belief and his interpretation of the evidence. More than half of our sample exited our experiment with more extreme beliefs than at the start, even though the evidence presented to them was neutral (emphasis added).

As I say, this seems like an explanation of "confirmation bias" to me. We all tend to believe, even in the face of contradictory evidence, that what we already think is correct, and that any evidence presented to us supports our existing beliefs. The image at the top of this blog posting makes the concept clear. To my mind, the most "chilling" statement in the excerpt I have quoted is this one: "more than half of our sample exited our experiment with more extreme beliefs than at the start, even though the evidence presented to them was neutral." 
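The counting model Fryer describes in the excerpt can be sketched in a few lines of code. This is only an illustration of the column's nine-signal example; the function names, and the "biased reader" rule of counting every ambiguous signal as the prior belief, are my rendering of the excerpt, not Fryer's actual model.

```python
from collections import Counter

# The nine-signal sequence from the excerpt: "A", "B", or the
# ambiguous "AB".
signals = ["A", "B", "AB", "AB", "A", "AB", "A", "B", "B"]

def rational_tally(signals):
    """Count each signal at face value, ambiguous signals included."""
    return Counter(signals)

def biased_tally(signals, prior):
    """Read every ambiguous 'AB' signal as support for the prior."""
    return Counter(prior if s == "AB" else s for s in signals)

print(rational_tally(signals))           # three A, three B, three AB
print(biased_tally(signals, prior="A"))  # six A, three B: "2-to-1 for A"
```

A rational observer ends up with a tie (three for A, three for B, three ambiguous), while an observer who starts out believing A counts the same evidence as favoring A two to one, exactly the entrenchment the excerpt describes.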

If it is true that persons become more "extreme" in their preexisting political views the more evidence they receive on the topic at issue (and even if that evidence is contradictory of, or neutral with respect to, the views they already hold), then we are in a lot of potential trouble. Typically, we try to "persuade" people with whom we disagree, most notably by providing them with evidence that they are "wrong." According to the Fryer experiment, debate and discussion aimed at "persuading" those with opposite views is actually counterproductive! In fact, the whole "persuasion" process is what stimulates political polarization.

Well, is there something we can do about this? I am thinking, specifically, about our political disagreements over public policy options. Whether or not the death penalty "works," and whether or not global warming is caused by human activities, are both good examples of the kind of policy debates in which political polarization is, perhaps, made more extreme by efforts by those on each side of the debate to adduce "evidence" to support the policy position they already believe is "best." And there are, of course, LOTS of such public policy questions about which there is profound disagreement.

Here is one answer, and it comes by way of a well-known phrase: let's just "agree to disagree." In other words, if we decide that differences must, or should, be eliminated, in the context of our democratic politics, then "polarization" will increase, as each "side" believes that it must override and "defeat" the other side. When disagreements are defined as questions of "truth, justice, or the American way," it is easy to conclude that the "other side" is not only "wrong," but is, actually, "evil," and thus must be defeated, eliminated, or rendered politically powerless.

That is the way that "political polarization" seems to work, and I think that's where we are, right now!

But what about the idea that there are lots of "opinions," and that this is just fine? What about the idea that it is, actually, totally acceptable for wives and husbands to disagree about safe driving techniques (and that divorce, therefore, is not required when one of the marital partners is using the horn in a way that the other finds objectionable)?

What about the idea that Americans can have different opinions about "climate change," and its causes, and about the desirability of tariff increases, etc., and that it is actually acceptable for us to disagree about the policy questions that face the nation? 

That sounds "nice," but if we have a system in which we "agree to disagree," then how do we decide whether or not to open the national parks to oil development, or whether to increase efforts to identify and deport anyone who has entered the United States without permission? How do we decide what actions to take on the real questions that the nation must address?

The Constitution does, in fact, provide a technique for making collective decisions that does not require that one "side" or another be "defeated." Debate and discussion take place in Congress, and unless the President vetoes the resolution decided upon by the Congress, what Congress decides is what will happen. No "side" is ever permanently "defeated" or "eliminated." Polarization is reduced. 

When differences generate "sides" that insist that they are "right," that the "other side" is "wrong," and that what is "wrong" must be eliminated, polarization is increased. But it is, actually, possible to live, love, and prosper in a society that can "agree to disagree," even as the political process makes decisions and the nation takes action - not upon the basis that "right" has triumphed over "wrong," but on the basis that this is what was decided this time, and that we might well want to revisit the issues later on, after a couple of years, and after our next election has come and gone. 

