Thursday, September 15, 2016
Neither intelligence nor education can stop you from forming prejudiced opinions – but an inquisitive attitude may help you make wiser judgements.
http://www.bbc.com/future/story/20160907-how-curiosity-can-protect-the-mind-from-bias
• by Tom Stafford
8 September 2016
Ask a left-wing Brit what they believe about the safety of nuclear power, and you can guess their answer. Ask a right-wing American about the risks posed by climate change, and you can also make a better guess than if you didn’t know their political affiliation. Issues like these feel like they should be informed by science, not our political tribes, but sadly, that’s not what happens.
Psychology has long shown that education and intelligence won’t stop your politics from shaping your broader worldview, even if those beliefs do not match the hard evidence. Instead, your ability to weigh up the facts may depend on a less well-recognised trait – curiosity.
The political lens
It is a mistake to think that you can somehow ‘correct’ people’s views on an issue by giving them more facts
There is now a mountain of evidence to show that politics doesn’t just help predict people’s views on some scientific issues; it also affects how they interpret new information. This is why it is a mistake to think that you can somehow ‘correct’ people’s views on an issue by giving them more facts, since study after study has shown that people have a tendency to selectively reject facts that don’t fit with their existing views.
This leads to the odd situation that people who hold the most extreme anti-science views – skeptics of the risks of climate change, for example – are often more scientifically informed than those who hold the same views less strongly.
People who have the facility for deeper thought about an issue can use those cognitive powers to justify what they already believe
But smarter people shouldn’t be susceptible to prejudice swaying their opinions, right? Wrong. Other research shows that people with the most education, the highest mathematical abilities, and the strongest tendencies to be reflective about their beliefs are the most likely to resist information which should contradict their prejudices. This undermines the simplistic assumption that prejudices are the result of too much gut instinct and not enough deep thought. Rather, people who have the facility for deeper thought about an issue can use those cognitive powers to justify what they already believe and find reasons to dismiss apparently contrary evidence.
It’s a messy picture, and at first looks like a depressing one for those who care about science and reason. A glimmer of hope can be found in new research from a collaborative team of philosophers, film-makers and psychologists led by Dan Kahan of Yale University.
Kahan and his team were interested in politically biased information processing, but also in studying the audience for scientific documentaries and using this research to help film-makers. They developed two scales. The first measured a person’s scientific background: a fairly standard set of questions asking about knowledge of basic scientific facts and methods, as well as quantitative judgement and reasoning. The second scale was more innovative, designed to measure something related but independent – a person’s curiosity about scientific issues, not how much they already knew. It was also innovative in how that curiosity was measured: as well as asking questions, the team gave people choices about what material to read as part of a survey about reactions to news. If an individual chose to read science stories rather than sport or politics, their science curiosity score was marked up.
Armed with their scales, the team then set out to see how they predicted people’s opinions on public issues which should be informed by science. With the scientific knowledge scale, the results were depressingly predictable. The left-wing participants – liberal Democrats – tended to judge issues such as global warming or fracking as significant risks to human health, safety or prosperity. The right-wing participants – conservative Republicans – were less likely to judge the issues as significant risks. What’s more, the liberals with more scientific background were most concerned about the risks, while the conservatives with more scientific background were least concerned. That’s right – higher levels of scientific education result in greater polarisation between the groups, not less.
So much for scientific background, but scientific curiosity showed a different pattern. Differences between liberals and conservatives still remained – on average there was still a noticeable gap in their estimates of the risks – but their opinions were at least heading in the same direction. For fracking, for example, more scientific curiosity was associated with more concern, for both liberals and conservatives.
The team confirmed this using an experiment which gave participants a choice of science stories, either in line with their existing beliefs, or surprising to them. Those participants who were high in scientific curiosity defied the predictions and selected stories which contradicted their existing beliefs – this held true whether they were liberal or conservative.
And, in case you are wondering, the results hold for issues where political liberalism is associated with anti-science beliefs, such as attitudes to GMOs or vaccination.
So, curiosity might just save us from using science to confirm our identity as members of a political tribe. It also shows that to promote a greater understanding of public issues, it is as important for educators to try to convey their excitement about science and the pleasures of finding out stuff as it is to teach people some basic curriculum of facts.