The science of scientific denial

The internet and the wealth of expertise it collates should have killed off scientific denial. If I wilfully believed that the Earth is flat, a quick Google search would swiftly disabuse me of that misconception with hard, irrefutable facts.

And yet, often the opposite is true. Dig deep enough and you’ll find plenty of loud voices arguing strongly against the mainstream scientific consensus, with varying degrees of sense. It’s pretty harmless in most cases, but what about situations where it really matters? All over the world, people who fervently believe vaccinations are unnecessary are causing diseases we’d eradicated to come back, while climate change – which scientists almost universally agree is driven by humans – continues to have noisy doubters preventing the progress necessary to mitigate the worst of its effects. Why does this happen in the face of overwhelming supporting evidence?

One view is that scientific threats are often abstract and too vague to respond to. Robert Gifford authored a 2011 paper outlining seven psychological barriers (or “dragons”) that stand in the way of serious action against climate change, including ideologies (“it’s wrong for the state to intervene”), social comparisons (“why should I change when he won’t?”) and ignorance of the problem (“climate what?”).

An abstract threat

Another suggestion, made by Gifford in a 2010 talk, is that climate change is such a slow-moving threat that it doesn’t register as a serious, pressing danger. You know the scientifically dubious theory that frogs won’t jump out of slowly heated water and boil alive because they don’t sense the threat? It’s like that.

Another take on this is a little more cynical: it’s all about self-preservation. Or rather, preservation of our way of life. Accepting that climate change is real and happening, and deciding to do something about it, would change the way we live – either by government diktat (carbon credits, emission caps, etc.) or by personal moral choice. Eating less (or no) meat, or forgoing air travel, could make someone’s quality of life appear drastically worse. It’d be more convenient if the science were wrong and there were no need or point in changing, right? That was one of the depressing conclusions reached from testing Swiss focus groups.

In a recent Reddit Ask Me Anything, John Cook, author of Climate Change Denial: Heads in the Sand, explained that his own findings back this up. “The main driver of climate science denial is political ideology. Some people don’t like the solutions to climate change that involve regulation of polluting industries. Not liking the solutions, they deny there’s a problem in the first place.”

Lawyers, not scientists

The first step in this is confirmation bias – the psychological tendency to treat evidence that backs up your argument as more legitimate, accurate or important than the evidence against it. When it comes to rational thought, to quote Jonathan Haidt, “we think we are scientists discovering the truth, but actually we are lawyers arguing for positions we arrived at by other means.”

Okay, fine, people don’t want to believe, and they can convince themselves – deliberately or not – that they don’t have to. But what if something happens to prove them completely wrong? Surely that would shake the belief?


Not necessarily. In 1956, psychologists Leon Festinger, Henry Riecken and Stanley Schachter published When Prophecy Fails – a book that covered a Chicago cult known as the Seekers, who believed an apocalypse was coming on 21 December 1954.

Many had quit jobs, left partners and given away their possessions in order to board a flying saucer that the prophecy had foretold would come to take them away.

The psychologists studied the Seekers’ reactions when the predicted end failed to arrive, and while you might expect them to accept their belief had been misplaced, in fact the opposite was true: they doubled down and changed their message – the prophecy was real; it’s just that their faith had ensured God’s mercy and the Earth had been saved. They had sunk too much into the faith to give it up, and were in for the long haul.

The internet’s echo chamber

The internet subtly makes this stubbornness worse. Not only are extreme and inaccurate views easy to come by in supposed research, but if you happen to be ill-informed on something, the chances are your social network of choice isn’t going to be the one to challenge you. In his TED talk, Eli Pariser speaks about “filter bubbles” and the impact they can have on your worldview by surrounding you with virtual “yes men”.

This ‘echo chamber effect’ can have a strong influence on people’s perceptions, making once outlandish opinions seem commonplace, and reinforcing one’s own worldview as correct and representative. To take an example from outside science: plenty of left-wing voters were astonished that the Conservatives won the 2015 general election, when all their friends on Twitter had given the illusion of a popular turn against the government.

But if our social networks are so keen to push us towards websites and people who echo our worldview, then why are internet commenters so vicious? You may notice that articles about climate change tend to bring out the most hostile commenters – that isn’t a coincidence.

Artificial authority astroturf

It’s widely established that entire careers have been constructed around “influencing people’s opinions”, with companies overloading contentious comment threads with their side of the story. Astroturfing, as it’s known (so called because it imitates a grassroots campaign while being entirely artificial), is actually illegal in the UK and has been for some time, but it’s one of those crimes that’s nigh on impossible to prove. It happens with political parties and companies promoting their products, but it also happens in science: with GMOs, homeopathy, anti-vaxxers and climate change.

You can spot certain common techniques and get an eye for it eventually, but this account from a former astroturfer sums it up pretty well:

“If a poster wrote something close to ‘X,’ we were supposed to respond with something close to ‘Y.’ ‘You have to mix it up a bit, though,’ said my trainer. ‘Otherwise it gets too obvious. Learn to use a thesaurus.’ This section also contained a number of hints for derailing conversations that went too far away from what we were attempting.”

It sounds pretty crude, but the evidence suggests that, worryingly, it can work. A 2011 study in the Journal of Business Ethics found that students exposed to astroturfing websites on climate change, racism and fair trade found themselves less sure about the issues than they were when they began. The effect was particularly strong on those with open minds, as you’d expect, but even experts in their areas found themselves less certain. The kicker is that this was the case even though the students didn’t trust the source or the information. In other words: even the least impressive argument can make us question our judgements.

Clouding the issue


In the case of climate change deniers, that’s all that’s required. As Cook says, “The one advantage that climate science denial has is that all that needs to be done to delay action on climate change is to foster doubt and confusion. To achieve this, they don’t have to provide an alternative, coherent position – they just have to cast doubt on the overwhelming body of evidence that humans are causing global warming.” Mission accomplished.

But shouldn’t that cut both ways? Shouldn’t the malleability of our brains make it easier for scientific authorities to get their message across too? To a degree, and that’s one of the reasons why we’re approaching a global consensus. But if people are really stubborn, in the ways outlined at the start of this article, you’re on a hiding to nothing. A few years ago, I wrote a piece for Wired about changing minds. The take-home: it’s incredibly difficult. There are studies showing that presenting facts can make people more sure of their original position; that the less we know about a subject, the more strongly we feel about it; and, most depressingly of all, that people keep debunked ideas in their heads because that’s easier than correcting them.

In short, maddening as it is, there isn’t a great deal you can do when you see scientific denialism at work, except have faith that it will die out in the long run. Hopefully, in the case of climate change, before the planet does.

Images: Melinda Mendez, Tavis Ford, Kevan, Steven Depolo, Kevin Dooley used under Creative Commons
