How partisans see facts through different eyes
Those who want to restrict travel from Muslim countries or sales of assault weapons use one rationale to buttress their arguments and a different one to dismiss their opponents', according to new research from CU Boulder.
In our politically polarized country, we often hear this common refrain: If people on both sides of the aisle could simply look at the same facts, they'd be able to see eye to eye, have measured discussions and enact reasonable laws and policies.
But new research from the University of Colorado Boulder, published in March in the journal Cognition, suggests that when people with differing political views are provided with the same statistics, and believe that those facts are accurate, they prioritize the information differently, based on their existing opinions.
The findings provide one explanation for the sharply diverging opinions people have when it comes to polarizing policies: we're literally perceiving the facts differently.
"We often assume when dealing with the partisan divide that if we could give everyone the same information and get them to believe in the accuracy of that information, we would reduce partisan conflict," said Leaf Van Boven, a CU Boulder professor of psychology and neuroscience. "But what this research shows is that even when you give people the same information, they can have very different partisan reactions to that underlying information."
Van Boven and a team of co-authors set out to study our reasoning using "conditional probabilities," or the likelihood that something will occur given that one or more other conditions have occurred.
More specifically, they looked at what statistics people considered to be the most important when contemplating policies that restrict broad categories of people or actions to lower the risk of rare events, such as a travel ban for immigrants from majority-Muslim countries to reduce terrorist attacks and a ban on the sale of assault weapons to curb mass shootings.
The researchers were inspired, in part, by how politicians and pundits used conditional probabilities to defend restrictive policies. After the Sept. 11 terrorist attacks, for example, conservative commentator Ann Coulter advocated for the expulsion of Muslim immigrants from the country, writing that "all terrorists are Muslims."
"We're particularly interested in these policies involving rare events because policy makers often use conditional probabilities to explain why a policy is useful, and there's a lot of research suggesting that people actually have a tough time thinking about conditional probabilities," said Jairo Ramos, a CU Boulder graduate student in social psychology and one of the study's co-authors.
When evaluating the effectiveness of a policy intended to reduce terrorism, for instance, it's more relevant to consider the vanishingly small fraction of Muslim immigrants who commit terrorist attacks, rather than the fraction of immigrant terrorists who come from Muslim countries, the researchers write.
In essence, because the percentage of Muslim immigrants who are terrorists is extremely small, banning all Muslims from entering the country would reduce an extremely small threat to a somewhat smaller threat, the researchers explain. And yet, many people are motivated by statements like the one made by Coulter: that a relatively high proportion of immigrant terrorist attacks are committed by Muslims.
The researchers used a less-polarizing example to clarify this point: professional basketball players. While a majority of NBA players are African American, only a small fraction of African American males play in the NBA.
"You would never try to look for potential NBA recruits by starting with all African American males; that would be an incredibly wasteful strategy," Van Boven said. "But it's really the same thing we do when we first look at Muslim immigrants."
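To make that asymmetry concrete, here is a minimal sketch of the two conditional probabilities. The counts are rough ballpark assumptions chosen for illustration, not figures from the study:

```python
# Illustrative ballpark figures (assumptions for this sketch, not data from the study):
nba_players = 450                      # roughly the number of players on NBA rosters
frac_nba_african_american = 0.75       # rough share of NBA players who are African American
african_american_males = 20_000_000    # rough number of African American males in the U.S.

african_american_nba_players = nba_players * frac_nba_african_american

# P(African American | NBA player) -- large
p_aa_given_nba = african_american_nba_players / nba_players

# P(NBA player | African American male) -- vanishingly small
p_nba_given_aa = african_american_nba_players / african_american_males

print(f"P(African American | NBA player)      ~ {p_aa_given_nba:.0%}")
print(f"P(NBA player | African American male) ~ {p_nba_given_aa:.5%}")
```

Both statements can be true at once: most NBA players are African American, yet the chance that any given African American male plays in the NBA is tiny, which is why screening the larger group would be, as Van Boven puts it, incredibly wasteful.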
Considering probabilities
To explore this phenomenon in the context of politically polarized policies, the researchers asked more than 500 American adults to review a list of statistics related to terrorism or mass shootings, then select which statistic they considered the most important for evaluating policies meant to reduce the risk of those events. Participants also considered this question from the perspective of an unbiased expert and someone with an opposing viewpoint to their own. (Co-authors in Israel ran a similar study using a policy to expel asylum seekers from Tel Aviv to reduce crime.)
As suspected, participants selected the probability that supported their existing stance on the policy.
For example, when considering a Muslim travel ban, a supporter of the policy was more likely to point to the fact that 72 percent of immigrants who commit terrorist attacks come from Muslim countries. But an opponent of that same policy was more likely to prioritize a different statistic: the probability that an immigrant from a Muslim country is a terrorist is 0.00004 percent.
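Both figures can be accurate at the same time because they answer different questions. A back-of-the-envelope calculation, using the numbers reported above plus an assumed pool of one million immigrants chosen purely to make the arithmetic concrete, shows what the 0.00004 percent figure implies:

```python
# The two statistics reported above, expressed as probabilities.
p_terrorist_given_muslim_immigrant = 0.00004 / 100  # 0.00004 percent
p_muslim_country_given_attacker = 0.72              # 72 percent

# Hypothetical pool size, assumed only for illustration.
assumed_immigrant_pool = 1_000_000

expected_attackers = assumed_immigrant_pool * p_terrorist_given_muslim_immigrant
print(f"Expected attackers among {assumed_immigrant_pool:,} immigrants: ~{expected_attackers:.1f}")
print(f"Share of immigrant attackers from Muslim-majority countries: {p_muslim_country_given_attacker:.0%}")
# ~0.4 expected attackers: restricting a million people addresses less than one expected
# attack, while the 72 percent figure describes only who past attackers were.
```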
Similarly, someone who supported an assault weapons ban placed more value on the fact that two-thirds of mass shootings were committed by people who owned assault weapons. An opponent of that same policy pointed instead to the fact that of the 12 million American adults who own assault weapons, just four committed a mass shooting.
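The same contrast holds for the assault weapons figures; a quick check of the arithmetic reported above:

```python
# Figures quoted above.
assault_weapon_owners = 12_000_000
owners_who_committed_mass_shootings = 4
share_of_shootings_by_owners = 2 / 3

p_shooter_given_owner = owners_who_committed_mass_shootings / assault_weapon_owners
print(f"P(commits a mass shooting | owns an assault weapon) ~ {p_shooter_given_owner:.7%}")
print(f"Share of mass shootings by assault weapon owners    ~ {share_of_shootings_by_owners:.0%}")
```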
Adopting the perspective of an unbiased expert mitigated some of this polarization, though not completely.
Importantly, the researchers found that both Democrats and Republicans placed more emphasis on probabilities that aligned with their existing views. That's because people tend to approach these questions like "intuitive politicians" rather than statisticians, Van Boven said.
"People are basically starting with the outcome they would like and then looking for evidence to support that outcome," he said. "When they stop and think like an expert, they can interrupt that process a little bit. When you think like an expert, we speculate that you first ask what the evidence shows: you start with the data and then reason from that perspective."
Another important takeaway from the experiment is that people can agree about the relevance of probabilities and still have different policy stances, based on their own underlying values. For instance, even if someone recognizes that the majority of assault weapons owners do not commit mass shootings, they might still want to ban assault weapons.
"Someone might say that the value of reducing mass shootings even by the smallest amount is worth requiring all assault weapons owners to give up their weapons, or that any reduction in the number of terrorist attacks is worth banning all Muslim immigrants from entering the country. Those might still seem like worthwhile tradeoffs to someone," Van Boven said.
Acknowledging biases
Van Boven suggests ways to apply these new findings to our own lives. For starters, if you're trying to be more open-minded and less biased during political discussions, try mentally stepping into the shoes of an unbiased expert or statistician. Look at the numbers and see where they lead you.
"Even in these very emotional, high-conflict partisan topics, we can and should approach it through the lens of statistical reasoning," Van Boven said. "We really should start by asking what is the likelihood of these kinds of risks?"
Another real-world takeaway: Acknowledge that we all process statistics in this biased way, not just people on the other side of the table.
"On these worrisome, pressing issues of the day, we end up stuck in inaction because of the tendencies we all have," Van Boven said. "It's very easy to blame the other side and say, 'They're not thinking carefully and they're being irrational and unreasonable and that's why we can't have sensible policies.' But really, it's all of us."