The problem with facts is that in this day and age they are not enough. My title for today is shamelessly plagiarised from an article by Tim Harford, originally published in the Financial Times.
I was born just before Christmas 1953. Way back then:
Scientists were publishing solid evidence of a link between smoking and cancer. From the viewpoint of Big Tobacco, more worrying was that the world’s most read publication, The Reader’s Digest, had already reported on this evidence in a 1952 article, “Cancer by the Carton”. The journalist Alistair Cooke, writing in 1954, predicted that the publication of the next big scientific study into smoking and cancer might finish off the industry.
It did not. PR guru John Hill had a plan — and the plan, with hindsight, proved tremendously effective. Despite the fact that its product was addictive and deadly, the tobacco industry was able, for decades, to fend off regulation, litigation and the idea in the minds of many smokers that its products were fatal.
Is the link with the 21st-century assault on evidence-based policy making by the Trump administration in the United States obvious to you yet? If not, Tim continues:
So successful was Big Tobacco in postponing that day of reckoning that their tactics have been widely imitated ever since. They have also inspired a thriving corner of academia exploring how the trick was achieved. In 1995, Robert Proctor, a historian at Stanford University who has studied the tobacco case closely, coined the word “agnotology”. This is the study of how ignorance is deliberately produced; the entire field was started by Proctor’s observation of the tobacco industry. The facts about smoking — indisputable facts, from unquestionable sources — did not carry the day. The indisputable facts were disputed. The unquestionable sources were questioned. Facts, it turns out, are important, but facts are not enough to win this kind of argument.
Please read Tim’s article in its entirety, but towards the end he mentions that:
There’s a final problem with trying to persuade people by giving them facts: the truth can feel threatening, and threatening people tends to backfire. “People respond in the opposite direction,” says Jason Reifler, a political scientist at Exeter University. This “backfire effect” is now the focus of several researchers, including Reifler and his colleague Brendan Nyhan of Dartmouth.
All this adds up to a depressing picture for those of us who aren’t ready to live in a post-truth world. Facts, it seems, are toothless. Trying to refute a bold, memorable lie with a fiddly set of facts can often serve to reinforce the myth. Important truths are often stale and dull, and it is easy to manufacture new, more engaging claims. And giving people more facts can backfire, as those facts provoke a defensive reaction in someone who badly wants to stick to their existing world view. “This is dark stuff,” says Reifler. “We’re in a pretty scary and dark time.”
Is there an answer? Perhaps there is.
We know that scientific literacy can actually widen the gap between different political tribes on issues such as climate change — that is, well-informed liberals and well-informed conservatives are further apart in their views than liberals and conservatives who know little about the science. But a new research paper from Dan Kahan, Asheley Landrum, Katie Carpenter, Laura Helft and Kathleen Hall Jamieson explores the role not of scientific literacy but of scientific curiosity.
Unfortunately Tim provides no link to the learned article he references, but we can put that right! Here it is:
According to Kahan et al. in their introduction:
This article describes evidence suggesting that science curiosity counteracts politically biased information processing. This finding is in tension with two bodies of research. The first casts doubt on the existence of “curiosity” as a measurable disposition. The other suggests that individual differences in cognition related to science comprehension—of which science curiosity, if it exists, would presumably be one—do not mitigate politically biased information processing but instead aggravate it. The article describes the scale-development strategy employed to overcome the problems associated with measuring science curiosity. It also reports data, observational and experimental, showing that science curiosity promotes open-minded engagement with information that is contrary to individuals’ political predispositions. We conclude by identifying a series of concrete research questions posed by these results.
This all sounds curiouser and curiouser, so let’s go deeper down the rabbit hole shall we?
In less than two decades, politically motivated reasoning has assumed an imperial reach over the study of mass political opinion formation. It has driven to the periphery theories emphasizing rational choice dynamics, heuristic information processing, public-spirited idealism, and popular disengagement. It has colonized countless individual topics from group polarization to source-credibility effects, from biased information search to the effects of factual misinformation.
The final frontier that scholars have yet to fully chart, however, concerns individual differences. Who is most vulnerable to the tendency to selectively attend to information in patterns that reflect their commitment to ideologically like-minded groups, and who is the least vulnerable?
In this article, we report a curious finding about politically motivated reasoning. Data we have collected suggest that this form of reasoning appears to be negated by science curiosity.
Perhaps there is a glimmer of light at the end of the long dark tunnel after all? Kahan et al. conclude:
These two forms of evidence paint a picture—a flattering one indeed—of individuals of high science curiosity. In this view, individuals who have an appetite to be surprised by scientific information—who find it pleasurable to discover that the world does not work as they expected—do not turn this feature of their personality off when they engage political information but rather indulge it in that setting as well, exposing themselves more readily to information that defies their expectations about facts on contested issues. The result is that these citizens, unlike their less curious counterparts, react more open-mindedly and respond more uniformly across the political spectrum to the best available evidence.
whilst Tim Harford puts it this way:
Curiosity is the seed from which sensible democratic decisions can grow. It seems to be one of the only cures for politically motivated reasoning but it’s also, into the bargain, the cure for a society where most people just don’t pay attention to the news because they find it boring or confusing.
What we need is a Carl Sagan or David Attenborough of social science — somebody who can create a sense of wonder and fascination not just at the structure of the solar system or struggles of life in a tropical rainforest, but at the workings of our own civilisation: health, migration, finance, education and diplomacy.
Do you suppose that Dan Kahan is up to that job himself? Perhaps not, since his paper signs off as follows:
As we have taken pains to emphasize, this research remains at a formative stage. As always, there are unresolved questions. The goal of this article was to report the pleasure we took in observing these surprising results in the hope that doing so would motivate other curious researchers to join us in trying to answer them.
Perhaps it’s time to turn to Alice?