Updated research, out at the beginning of this year, shows that actually, humans are willing to listen to facts, and even change their mind, when the facts are presented to them.
It can be frustrating: you make your argument, you state your facts, but the person you are arguing with does not budge. It feels as if little things like facts count for nothing with some people.
Some people? Try re-phrasing that: all people. Are you really so open to facts that you could change your mind on a topic you think is important, such as gun laws, climate change, or the benefits and costs of Brexit?
And for years, studies have shown that we are not so good with facts.
Dan Kahan, a Yale law professor and a leading scholar of cultural cognition, has produced a paper which appears to show that we simply ignore facts if they contradict our preconceived notions.
He set his subjects a slightly tricky puzzle: a fictitious experiment involving two groups of people, of different sizes, who each had a skin rash. One group was given a cream. Here are the results of this fictitious experiment:
Source: Dan Kahan, "Motivated Numeracy and Enlightened Self-Government"

                                             Rash got better    Rash got worse
Group A: Patients who did use the cream            223                 75
Group B: Patients who didn't use the cream         107                 21
He then asked people to select one of two possible conclusions relating to his fictitious study:
1: People who used the skin cream were more likely to get better than those who didn’t.
2: People who used the skin cream were more likely to get worse than those who didn’t.
The answer is not obvious, but if you look at the ratio of 'rash got better' to 'rash got worse' for each of the two groups, you realise that people who used the skin cream were actually more likely to get worse than those who didn't: roughly 25% of the cream group got worse (75 out of 298), compared with about 16% of the other group (21 out of 128). As a rule, those who were better at maths tended to get the answer right.
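The arithmetic behind the puzzle can be checked in a few lines. This is just a sketch using the figures from the table above; the variable names are illustrative, not part of Kahan's study:

```python
# Figures from Kahan's fictitious skin-cream experiment (see table above)
cream = {"better": 223, "worse": 75}
no_cream = {"better": 107, "worse": 21}

# Proportion of each group whose rash got worse
worse_with_cream = cream["worse"] / (cream["better"] + cream["worse"])
worse_without = no_cream["worse"] / (no_cream["better"] + no_cream["worse"])

print(round(worse_with_cream, 3))  # 0.252 — about 25% of cream users got worse
print(round(worse_without, 3))     # 0.164 — about 16% of non-users got worse
```

The quick, misleading read compares the raw counts (223 vs 107 got better); the correct read compares proportions within each group, which is what the ratios above capture.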
He then set precisely the same puzzle, but in a different context. Instead of rash and cream, the subject of the study concerned the link between carrying guns and crime in the US. In this case, reasoning went out of the window. If the numbers were fixed so that a quick look suggested carrying guns reduced crime – when in fact a more careful look revealed the opposite finding – it was irrelevant how good the test subjects were at maths. If they held liberal views on crime they were more likely to get the answer right. Then Kahan flipped the stats, to show the opposite finding. In this scenario those with more conservative views on carrying guns got the answer right.
Researchers call this the backfire effect. In a 2010 study focusing on how corrections about weapons of mass destruction in Iraq were received, Nyhan and Reifler found that presenting respondents with corrective facts can actually entrench their misperceptions.
You might recall that, at the time, we were told Saddam Hussein was definitely lying when he said there were no weapons of mass destruction, a claim used to justify the Iraq invasion.
But a paper by Thomas Wood at The Ohio State University and Ethan Porter at George Washington University suggests that this backfire effect is not as serious as previously thought. Originally published in 2016 but updated on January 2nd 2018, the paper presents results from five experiments with more than 10,100 subjects, testing 52 issues where backfire might be expected. It states: “Across all experiments, we found no corrections capable of triggering backfire, despite testing precisely the kinds of polarised issues where backfire should be expected. Evidence of factual backfire is far more tenuous than prior research suggests.
“By and large, citizens heed factual information, even when such information challenges their ideological commitments.”
You can read the full paper here.