(…or Why It Sometimes Might Be Better For Us Academics to Shut The F**k Up Occasionally.)
“Boost Public Engagement to Beat Pseudoscience, says Jim Al-Khalili” goes the headline on p.19 of this week’s Times Higher Education, my traditional Saturday teatime read. The brief article, a summary of points Jim made during his talk at the Young Universities Summit, continues…
Universities must provide more opportunities for academics to engage with the public or risk allowing pseudoscience to “fill the vacuum”, according to Jim Al-Khalili.
Prof. Al-Khalili is an exceptionally talented and wonderfully engaging science communicator. I enjoy, and very regularly recommend (to students and science enthusiasts of all stripes), his books and his TV programmes. But the idea that education and academic engagement are enough to counter pseudoscience is, at the very best, misleading and, at worst, a dangerous and counter-productive message to propagate.
The academic mantra of “education, education, education” as the unqualified panacea for every socioeconomic ill, although comforting, is almost always a much too simplistic — and, for some who don’t share our ideological leanings, irritatingly condescending — approach. I’ve written enthusiastically before about Tom Nichols’ powerful “The Death of Expertise”, and I’ve lost count of the number of times that I’ve referred to David McRaney’s The Backfire Effect in previous posts and articles I’ve written. It does no harm to quote McRaney one more time…
The last time you got into, or sat on the sidelines of, an argument online with someone who thought they knew all there was to know about health care reform, gun control, gay marriage, climate change, sex education, the drug war, Joss Whedon or whether or not 0.9999 repeated to infinity was equal to one – how did it go?
Did you teach the other party a valuable lesson? Did they thank you for edifying them on the intricacies of the issue after cursing their heretofore ignorance, doffing their virtual hat as they parted from the keyboard a better person?
Perhaps you’ve been more fortunate than McRaney (and me). But somehow I doubt it.
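(For what it’s worth, the 0.999… question is one of the few items on McRaney’s list with a definitive answer. The standard one-line argument, for anyone who has sat through that particular online argument:

```latex
x = 0.\overline{9} \;\Rightarrow\; 10x = 9.\overline{9} \;\Rightarrow\; 10x - x = 9 \;\Rightarrow\; 9x = 9 \;\Rightarrow\; x = 1
```

Not that presenting this derivation has ever, in my experience, settled the argument.)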
As just one example from McRaney’s list, there is strong and consistent evidence that, in the U.S., Democrats are much more inclined than Republicans to accept the evidence for anthropogenic climate change. That’s bad enough, but the problem of political skew in motivated rejection of science is much broader. A similar and very pronounced right–left asymmetry exists across the board, as discussed in Lewandowsky and Oberauer’s influential paper, Motivated Rejection of Science. I’ll quote from their abstract, where they make the same argument as McRaney but in rather more academic, though no less compelling, terms:
Rejection of scientific findings is mostly driven by motivated cognition: People tend to reject findings that threaten their core beliefs or worldview. At present, rejection of scientific findings by the U.S. public is more prevalent on the political right than the left. Yet the cognitive mechanisms driving rejection of science, such as the superficial processing of evidence toward the desired interpretation, are found regardless of political orientation. General education and scientific literacy do not mitigate rejection of science but, rather, increase the polarization of opinions along partisan lines.
Let me repeat that last line, in bold, for emphasis. It’s exceptionally important.
General education and scientific literacy do not mitigate rejection of science but, rather, increase the polarization of opinions along partisan lines.
If we blithely assume that the rejection of well-accepted scientific findings — and the potential subsequent descent into the cosy embrace of pseudoscience — is simply a matter of a lack of education and engagement, we fail to recognise the complex and multifaceted sociology and psychology at play here. Yes, we academics need to get out there and talk about the research we and others do — and I’m rather keen on doing this myself (as discussed here, here, and here) — but let’s not make the mistake of assuming that there’s always a willing audience waiting with bated breath for the experts to come and correct them on what they’re getting wrong.
I spend a lot of time on public engagement, both online and off — although not, admittedly, as much as Jim — and I’ve encountered the “motivated rejection” effect time and time again over the years. Here’s just one example of what I mean — a comment posted under the most recent Computerphile video I did with Sean Riley:
The “zero credibility” comment stems not from the science presented in the video but from a reaction to my particular ideological and political leanings. For reasons I’ve discussed at length previously, I’ve been labelled as an “SJW” — a badge I’m happy to wear with quite some pride. (If you’ve not encountered the SJW pejorative previously, lucky you. Here’s a primer.) Because of my SJW leanings, the science I present, regardless of its accuracy (and level of supporting evidence/research), is immediately rejected by a subset of aggrieved individuals who do not share my political outlook. They dismiss the credibility or validity of the science outright, not on the basis of the content or the strength of the data/evidence, but solely on the basis of their ideological, emotional, knee-jerk reaction to me…
(That screenshot above is taken from the comments section for this video.)
It’s worth noting that the small hardcore of viewers who regularly downvote and leave comments about the ostensible lack of credibility of the science I present are very often precisely those who would claim to be ever-so-rational and whose clarion call is “Facts over feels”. Yet they are so opposed to my “SJW-ism” that they reject everything I say, on any topic, as untrustworthy; they cannot get beyond their gut-level emotional reaction to me.
My dedicated following of haters is a microcosm of the deep political polarisation we’re seeing online, with science caught in the slip-stream and accepted/rejected on the basis of how it appeals to a given worldview, rather than on the strength of the scientific evidence itself. (And it’s always fun to be told exactly how science works by those who have never carried out an experiment, published a paper, been a member of a peer-review panel, reviewed a grant etc.) This raises the question: Am I, as a left-leaning academic with clearly diabolical SJW tendencies, in any position at all to educate this particular audience on any topic? Of course not. No matter how much scientific data and evidence I provide, it will be dismissed out of hand because I am not of their tribe.
Jim Al-Khalili’s argument at the Young Universities Summit that what’s required is ever-more education and academic engagement is, in essence, what sociologists and Science and Technology Studies (STS) experts would describe as the deficit model. The deficit model has been widely discredited because it simply does not accurately describe how we modify our views (or not) in the light of more information. (At the risk of making …And Then There’s Physics scream, I encourage you to read their informative and entertaining posts on the theme of the deficit model.)
Prof. Al-Khalili is further reported as stating that “…to some extent, you do have to stand up and you do have to bang on about evidence and rationalism, because if we don’t, we will make the same mistakes of the past where the vacuum will be filled with people talking pseudoscience or nonsense.”
Banging on about evidence and rationalism will have close to zero effect on ideologically opposed audiences because they already see themselves as rational and driven by evidence; they won’t admit to being biased and irrational because their bias is unconscious. And we are all guilty of succumbing to unconscious bias, to a greater or lesser extent. Force-feeding more data and evidence to those with whom we disagree is not only unlikely to change their minds, it’s much more likely to entrench them further in their views. (McRaney, passim.)
Let me make a radical suggestion. What if we academics decided to engage rather less sometimes? After all, who is best placed to sway the position — on climate change, vaccination, healthcare, social welfare, or just about any topic — of a deeply anti-establishment Trump supporter who has fallen hook, line, and sinker for the “universities are hotbeds of cultural Marxism” meme? A liberal academic who can trot out chapter and verse from the literature, and present watertight quantitative (and qualitative) arguments?
Of course not.
We need to connect, somehow, beyond the level of raw data and evidence. We need to appeal to that individual’s biases and psychology. And that means thinking more cannily, and more politically, about how we influence a community. Barking, or even gently reciting, facts and figures is not going to work. This is uncomfortable for any scientist, I know. But you don’t need to take my word for it — review the evidence for yourself.
The strength of the data used to support a scientific argument almost certainly won’t make a damn bit of difference when a worldview or ideology is challenged. And that’s not because our audience is uneducated. Nor are they unintelligent. They are behaving exactly as we do. They are protecting their worldview via the backfire effect.
One might credibly argue that the rejection skew could lean the other way on certain topics, such as the anti-vaccination debate, where anecdotal, and other, evidence might suggest a stronger liberal/left bias. It turns out that even when it comes to anti-vaxxers, there is a considerable amount of data suggesting that it’s the right that shows the higher degree of anti-science bias. Here’s one key example: Trust in Scientists on Climate Change and Vaccines, L. C. Hamilton, J. Hartter, and K. Saito, SAGE Open, July–Sept 2015, 1–13. See also Beyond Misinformation, S. Lewandowsky, U. K. H. Ecker, and J. Cook, J. Appl. Res. Mem. Cogn. 6, 353 (2017) for a brief review of some of the more important literature on this topic.
 …but then it’s all lefty, liberal academics writing these papers, right? They would say that.
Here’s an amusing recent example of numerological nonsense being passed off as scientific reasoning. Note that Peter Coles’ correspondent claims that the science is on his side. How persuasive do you think he’ll find Peter’s watertight, evidence-based reasoning to be? How might he be further persuaded? Will more scientific evidence and data do the trick?