Science Proves Nothing

Here’s the first, provocatively titled, lecture for this year’s “Politics, Perception, and Philosophy of Physics” module. This year, I plan to upload video here for each F34PPP session on a weekly schedule (although the best laid plans gang aft agley…)

Erratum: Around the 43-minute mark I say “Polish group” when I mean “Czech group”. (Apologies to Pavel Jelinek et al.)

Down On The Upside

I stumbled across the wonderful skepticalscience.com website last night (via Ken Rice‘s Twitter feed) and just had to quickly blog about this brilliant, at-a-glance rebuttal of that hoary old “The data don’t lie” aphorism. The graph speaks for itself…

 “But Philip, I thought you’d sworn off Twitter?” I have — I killed my Twitter account almost four years ago and have not once regretted it since. For one thing, a Twitter account is not required in order to read tweets and I occasionally dip into the Twitter threads of colleagues and friends I used to follow (Ken among them) via search.twitter.com.

“We don’t need no education…”

(…or Why It Sometimes Might Be Better For Us Academics to Shut The F**k Up Occasionally.)

“Boost Public Engagement to Beat Pseudoscience, says Jim Al-Khalili” goes the headline on p.19 of this week’s Times Higher Education, my traditional Saturday teatime read. The brief article, a summary of points Jim made during his talk at the Young Universities Summit, continues…

Universities must provide more opportunities for academics to engage with the public or risk allowing pseudoscience to “fill the vacuum”, according to Jim Al-Khalili.

Prof. Al-Khalili is an exceptionally talented and wonderfully engaging science communicator. I enjoy, and very regularly recommend (to students and science enthusiasts of all stripes), his books and his TV programmes. But the idea that education and academic engagement are enough to counter pseudoscience is, at the very best, misleading and, at worst, a dangerous and counter-productive message to propagate.

The academic mantra of “education, education, education” as the unqualified panacea for every socioeconomic ill, although comforting, is almost always a much too simplistic — and, for some who don’t share our ideological leanings, irritatingly condescending — approach. I’ve written enthusiastically before about Tom Nichols’ powerful “The Death of Expertise”, and I’ve lost count of the number of times that I’ve referred to David McRaney’s The Backfire Effect in previous posts and articles I’ve written. It does no harm to quote McRaney one more time…

The last time you got into, or sat on the sidelines of, an argument online with someone who thought they knew all there was to know about health care reform, gun control, gay marriage, climate change, sex education, the drug war, Joss Whedon or whether or not 0.9999 repeated to infinity was equal to one – how did it go?

Did you teach the other party a valuable lesson? Did they thank you for edifying them on the intricacies of the issue after cursing their heretofore ignorance, doffing their virtual hat as they parted from the keyboard a better person?

Perhaps you’ve been more fortunate than McRaney (and me). But somehow I doubt it.
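(As an aside, the 0.999… item in McRaney’s list is the one question there with a clean, uncontroversial resolution; a one-line geometric series settles it, which, tellingly, rarely ends the argument:

```latex
0.\overline{9} \;=\; 9\sum_{n=1}^{\infty} 10^{-n} \;=\; 9\cdot\frac{1/10}{1-1/10} \;=\; 1
```

That a result this easy to derive still generates interminable online rows rather makes McRaney’s point.)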

As just one example from McRaney’s list, there is strong and consistent evidence that, in the U.S., Democrats are much more inclined than Republicans to accept the evidence for anthropogenic climate change. That’s bad enough, but the problem of political skew in motivated rejection of science is much broader. A similar, and similarly pronounced, right–left asymmetry exists across the board, as discussed in Lewandowsky and Oberauer’s influential paper, Motivated Rejection of Science. I’ll quote from their abstract, where they make the same argument as McRaney but in rather more academic, though no less compelling, terms [1]:

Rejection of scientific findings is mostly driven by motivated cognition: People tend to reject findings that threaten their core beliefs or worldview. At present, rejection of scientific findings by the U.S. public is more prevalent on the political right than the left. Yet the cognitive mechanisms driving rejection of science, such as the superficial processing of evidence toward the desired interpretation, are found regardless of political orientation. General education and scientific literacy do not mitigate rejection of science but, rather, increase the polarization of opinions along partisan lines.

Let me repeat and embolden that last line for emphasis. It’s exceptionally important.


General education and scientific literacy do not mitigate rejection of science but, rather, increase the polarization of opinions along partisan lines.


If we blithely assume that the rejection of well-accepted scientific findings — and the potential subsequent descent into the cosy embrace of pseudoscience — is simply a matter of a lack of education and engagement, we fail to recognise the complex and multifaceted sociology and psychology at play here. Yes, we academics need to get out there and talk about the research we and others do — and I’m rather keen on doing this myself (as discussed here, here, and here) — but let’s not make the mistake of assuming that there’s always a willing audience waiting with bated breath for the experts to come and correct them on what they’re getting wrong.

I spend a lot of time on public engagement, both online and off — although not, admittedly, as much as Jim — and I’ve encountered the “motivated rejection” effect time and time again over the years. Here’s just one example of what I mean — a comment posted under the most recent Computerphile video I did with Sean Riley:

[Screenshot: a “zero credibility” comment posted under the video]

The “zero credibility” comment stems not from the science presented in the video but from a reaction to my particular ideological and political leanings. For reasons I’ve discussed at length previously, I’ve been labelled as an “SJW” — a badge I’m happy to wear with quite some pride. (If you’ve not encountered the SJW pejorative previously, lucky you. Here’s a primer.) Because of my SJW leanings, the science I present, regardless of its accuracy (and level of supporting evidence/research), is immediately rejected by a subset of aggrieved individuals who do not share my political outlook. They outright dismiss the credibility or validity of the science not on the basis of the content or the strength of the data/evidence but solely on their ideological, emotional, and knee-jerk reaction to me…

[Screenshot: downvotes and dismissive comments under the video]

(That screenshot above is taken from the comments section for this video.)

It’s worth noting that the small hardcore of viewers who regularly downvote and leave comments about the ostensible lack of credibility of the science I present are very often precisely those who would claim to be ever-so-rational and whose clarion call is “Facts over feels” [3]. Yet they are so opposed to my “SJW-ism” that they reject everything I say, on any topic, as untrustworthy; they cannot get beyond their gut-level emotional reaction to me.

My dedicated following of haters is a microcosm of the deep political polarisation we’re seeing online, with science caught in the slipstream and accepted/rejected on the basis of how it appeals to a given worldview, rather than on the strength of the scientific evidence itself. (And it’s always fun to be told exactly how science works by those who have never carried out an experiment, published a paper, been a member of a peer-review panel, reviewed a grant etc.) This then raises the question: Am I, as a left-leaning academic with clearly diabolical SJW tendencies, in any position at all to educate this particular audience on any topic? Of course not. No matter how much scientific data and evidence I provide, it will be dismissed out of hand because I am not of their tribe [3].

Jim Al-Khalili’s argument at the Young Universities Summit that what’s required is ever-more education and academic engagement is, in essence, what sociologists and Science and Technology Studies (STS) experts would describe as the deficit model. The deficit model has been widely discredited because it simply does not accurately describe how we modify our views (or not) in the light of more information. (At the risk of making …And Then There’s Physics scream, I encourage you to read their informative and entertaining posts on the theme of the deficit model.)

Prof. Al-Khalili is further reported as stating that “…to some extent, you do have to stand up and you do have to bang on about evidence and rationalism, because if we don’t, we will make the same mistakes of the past where the vacuum will be filled with people talking pseudoscience or nonsense.” 

Banging on about evidence and rationalism will have close to zero effect on ideologically opposed audiences because they already see themselves as rational and driven by evidence [3]; they won’t admit to being biased and irrational because their bias is unconscious. And we are all guilty of succumbing to unconscious bias, to a greater or lesser extent. Force-feeding more data and evidence to those with whom we disagree is not only unlikely to change their minds, it’s much more likely to entrench them further in their views. (McRaney, passim.)

Let me make a radical suggestion. What if we academics decided to engage rather less sometimes? After all, who is best placed to sway the position — on climate change, vaccination, healthcare, social welfare, or just about any topic — of a deeply anti-establishment Trump supporter who has fallen hook, line, and sinker for the “universities are hotbeds of cultural Marxism” meme? A liberal academic who can trot out chapter and verse from the literature, and present watertight quantitative (and qualitative) arguments?

Of course not.

We need to connect, somehow, beyond the level of raw data and evidence. We need to appeal to that individual’s biases and psychology. And that means thinking more cannily, and more politically, about how we influence a community. Barking, or even gently reciting, facts and figures is not going to work. This is uncomfortable for any scientist, I know. But you don’t need to take my word for it — review the evidence for yourself.

The strength of the data used to support a scientific argument almost certainly won’t make a damn bit of difference when a worldview or ideology is challenged. And that’s not because our audience is uneducated. Nor are they unintelligent. They are behaving exactly as we do. They are protecting their worldview via the backfire effect.

 


[1] One might credibly argue that the rejection skew could lean the other way on certain topics such as the anti-vaccination debate, where anecdotal, and other, evidence might suggest that there is a stronger liberal/left bias. It turns out that even when it comes to anti-vaxxers, there is quite a considerable amount of data to support that it’s the right that has a higher degree of anti-science bias [2]. Here’s one key example: Trust in Scientists on Climate Change and Vaccines, L. C. Hamilton, J. Hartter, and K. Saito, SAGE Open 5(3), 1–13 (2015). See also Beyond Misinformation, S. Lewandowsky, U. K. H. Ecker, and J. Cook, J. Appl. Res. Mem. Cogn. 6, 353 (2017) for a brief review of some of the more important literature on this topic.

[2] …but then it’s all lefty, liberal academics writing these papers, right? They would say that.

[3] Here’s an amusing recent example of numerological nonsense being passed off as scientific reasoning. Note that Peter Coles’ correspondent claims that the science is on his side. How persuasive do you think he’ll find Peter’s watertight, evidence-based reasoning to be? How should he be further persuaded? Will more scientific evidence and data do the trick?

 

If it seems obvious, it probably isn’t

…And Then There’s Physics’ post on science communication, reblogged below, very much struck a chord with me. This point, in particular, is simply not as widely appreciated as it should be:

“Maybe what we should do more of is make it clear that the process through which we develop scientific knowledge is far more complicated than it may, at first, seem.”

There can too often be a deep-seated faith in the absolute objectivity and certainty of “The Scientific Method”, which possibly stems (at least in part) from our efforts to not only simplify but to “sell” our science to a wide audience. The viewer response to a Sixty Symbols video on the messiness of the scientific process, “Falsifiability and Messy Science”, brought this home to me: The Truth, The Whole Truth, and Nothing But…

(…but I’ve worried for a long time that I’ve been contributing to exactly the problem ATTP describes: Guilty Confessions of a YouTube Physicist)

By the way, if you’re not subscribed to ATTP’s blog, I heartily recommend that you sign up right now.

...and Then There's Physics

There’s an interesting paper that someone (I forget who) highlighted on Twitter. It’s about when science becomes too easy. The basic idea is that there are pitfalls to popularising scientific information.

Compared to experts,

laypeople have not undergone any specialized training in a particular domain. As a result, they do not possess the deep-level background knowledge and relevant experience that a competent evaluation of science-related knowledge claims would require.

However, in the process of communicating, and popularising, science, science communicators tend to provide simplified explanations of scientific topics that can

lead[s] readers to underestimate their dependence on experts and conclude that they are capable of evaluating the veracity, relevance, and sufficiency of the contents.

I think that this is an interesting issue and it’s partly what motivated my post about public involvement in science.

However, I am slightly uneasy about this general framing. I think everyone is a…


Beauty and the Biased

A big thank you to Matin Durrani for the invitation to provide my thoughts on the Strumia saga — see “The Worm That (re)Turned” and “The Natural Order of Things?” for previous posts on this topic — for this month’s issue of Physics World. PW kindly allows me to make the pdf of the Opinion piece available here at Symptoms. The original version (with hyperlinks intact) is also below.

(And while I’m at it, an even bigger thank you to Matin, Tushna, and all at PW for this immensely flattering (and entirely undeserved, given the company I’m in) accolade…)


From Physics World, Dec. 2018.

A recent talk at CERN about gender in physics highlights that biases remain widespread. Philip Moriarty says we need to do more to tackle such issues head on

When Physics World asked several physicists to name their favourite books for the magazine’s 30th anniversary issue, I knew immediately what I would choose (see October pp 74-78). My “must-read” pick was Sabine Hossenfelder’s exceptionally important Lost In Math: How Beauty Leads Physics Astray, which was released earlier this year.

Hossenfelder, a physicist based at the Frankfurt Institute for Advanced Studies, is an engaging and insightful writer who is funny, self-deprecating, and certainly not afraid to cause offence. I enjoyed the book immensely, being taken on a journey through modern theoretical physics in which Hossenfelder attempts to make sense of her profession. If there is one chapter of the book that particularly resonated with me it’s the concluding Chapter 10, “Knowledge is Power”. This is a powerful closing statement that deserves to be widely read by all scientists, but especially by that irksome breed of physicist who believes — when all evidence points to the contrary — that they are somehow immune to the social and cognitive biases that affect every other human.

In “Knowledge is Power”, Hossenfelder adeptly outlines the primary biases that all good scientists have striven to avoid ever since the English philosopher Francis Bacon identified his “idols of the tribe” – i.e. the tendency of human nature to prefer certain types of incorrect conclusions. Her pithy single-line summary at the start of the chapter captures the key issue: “In which I conclude the world would be a better place if everyone listened to me”.

Lost in bias

Along with my colleague Omar Almaini from the University of Nottingham, I teach a final-year module entitled “The Politics, Perception, and Philosophy of Physics”. I say teach, but in fact, most of the module consists of seminars that introduce a topic for students to then debate, discuss and argue for the remaining time. We dissect Richard Feynman’s oft-quoted definition of science: “Science is the belief in the ignorance of experts”.  Disagreeing with Feynman is never a comfortable position to adopt, but I think he does science quite a disservice here. The ignorance, and sometimes even the knowledge, of experts underpins the entire scientific effort. After all, collaboration, competition and peer review are the lifeblood of what we do. With each of these come complex social interactions and dynamics and — no matter how hard we try — bias. For this and many other reasons, Lost In Math is now firmly on the module reading list.

At a CERN workshop on high-energy theory and gender at the end of September, theoretical physicist Alessandro Strumia from the University of Pisa claimed that women with fewer citations were being hired over men with more. Strumia faced an immediate backlash: CERN suspended him pending an investigation, and some 4000 scientists signed a letter calling his talk “disgraceful”. Strumia’s talk was poorly researched, ideologically driven, and an all-round embarrassingly biased tirade against women in physics. I suggest that Strumia needs to take a page — or many — out of Hossenfelder’s book. I was reminded of her final chapter time and time again as I read through Strumia’s cliché-ridden and credulous arguments, his reactionary pearl-clutching palpable from almost every slide of his presentation.

One criticism that has been levelled at Hossenfelder’s analysis is that it does not offer solutions to counter the type of biases that she argues are prevalent in the theoretical-physics community and beyond. Yet Hossenfelder does devote an appendix — admittedly rather short — to listing some pragmatic suggestions for tackling the issues discussed in the book. These include learning about, and thus tackling, social and cognitive biases.

This is all well and good, except that there are none so blind as those that will not see. The type of bias that Strumia’s presentation exemplified is deeply engrained. In my experience, his views are hardly fringe, either within or outside the physics community — one need only look to the social media furore over James Damore’s similarly pseudoscientific ‘analysis’ of gender differences in the context of his overwrought “Google Manifesto” last year. Just like Damore, Strumia is being held up by the usual suspects as the ever-so-courageous rational scientist speaking “The Truth”, when, of course, he’s entirely wedded to a glaringly obvious ideology and unscientifically cherry-picks his data accordingly. In a masterfully acerbic and exceptionally timely blog post published soon after the Strumia storm broke (“The Strumion. And On”), his fellow particle physicist Jon Butterworth (UCL) highlighted a number of the many fundamental flaws at the core of Strumia’s over-emotional polemic.

Returning to Hossenfelder’s closing chapter, she highlights there that the “mother of all biases” is the “bias blind spot”, or the insistence that we certainly are not biased:

“It’s the reason my colleagues only laugh when I tell them biases are a problem, and why they dismiss my ‘social arguments’, believing they are not relevant to scientific discourse,” she writes. “But the existence of those biases has been confirmed in countless studies. And there is no indication whatsoever that intelligence protects against them; research studies have found no links between cognitive ability and thinking biases.”

Strumia’s diatribe is the perfect example of this bias blind spot in action. His presentation is also a case study in confirmation bias. If only he had taken the time to read and absorb Hossenfelder’s writing, Strumia might well have saved himself the embarrassment of attempting to pass off pseudoscientific guff as credible analysis.

While the beauty of maths leads physics astray, it is ugly bias that will keep us in the dark.

 

Is Science Self-Correcting? Some Real-World Examples From Psychological Research

…or The Prognosis Is Not Good, Psychology. It’s A Bad Case Of Physics Envy*

Each year there are two seminars for the Politics, Perception, and Philosophy of Physics module that are led by invited speakers. First up this year was the enlightening, engaging, and entertaining Nick Brown, who, and I quote from no less a source than The Guardian, has an “astonishing story…[he] began a part-time psychology course in his 50s and ended up taking on America’s academic establishment.”

I recommend you read that Guardian profile in full to really get the measure of Mr. (soon to be Dr.) Brown but, in brief, he has played a central role in exposing some of the most egregious examples of breathtakingly poor, or downright fraudulent, research in psychology, a field that needs to get its house in order very soon. (A certain high-profile professor of psychology who is always very keen to point the finger at what he perceives to be major failings in other disciplines should bear this in mind and heed his own advice. (Rule #6, as I recall…))

Nick discussed three key examples of where psychology research has gone badly off the rails:

    • Brian Wansink, erstwhile director of Cornell’s Food and Brand Lab, whose research findings (cited over 20,000 times) have been found to be rather tough to digest given that they’re riddled with data manipulation and resulted from other far-from-robust research practices.
    • The “audacious academic fraud” of Diederik Stapel. (Nick is something of a polymath, being fluent in Dutch among other skills, and translated Stapel’s autobiography/confession, making it freely available online. I strongly recommend adding Stapel’s book to your “To Read” list; I found it a compelling story that provides a unique insight into the mindset and motivations of someone who fakes their research. Seeing the ostracisation and shaming through Stapel’s eyes was a profoundly affecting experience and I found myself sympathising with the man, especially with regard to the effects of his fraud on his family.)
    • The “critical positivity ratio” of Fredrickson and Losada, whose misapplication of Lorenz-style nonlinear dynamics to human emotions Nick, together with Alan Sokal and Harris Friedman, comprehensively dismantled.**

It was a great pleasure to host Nick’s visit to Nottingham (and to finally meet him after being in e-mail contact on and off for about eighteen months). Here’s his presentation…

*But don’t worry, you’re not alone.

** Hmmm. More psychologists with a chaotic concept of chaos. I can see a pattern emerging here. Perhaps it’s fractal in nature…


 

Update 18/11/2018. 15:30. I am rapidly coming to the opinion that in the dismal science stakes, psychology trumps economics by quite some margin. I’ve just read Catherine Bennett’s article in The Observer today on a research paper that created a lot of furore last week: “Testing the Empathizing-Systemizing theory of sex differences and the Extreme Male Brain theory of autism in half a million people”, a study which, according to a headline in The Times (amongst much other similarly over-excited and credulous coverage) has shown that male and female brains are very different indeed.

One would get the impression from the headlines that the researchers must have carried out an incredibly systematic and careful fMRI study, which, given the sample size, in turn must have taken decades and involved highly sophisticated data analysis techniques.

Nope.

They did their research by…asking people to fill in questionnaires.

Bennett highlights Dean Burnett’s incisive demolition of the paper and surrounding media coverage. I thoroughly recommend Burnett’s post – he highlights a litany of issues with the study (and others like it). For one thing, the idea that self-reporting via questionnaire can provide a robust objective analysis of just about any human characteristic or trait is ludicrously simple-minded. Burnett doesn’t cover all of the issues because, as he says at the end of his post: “There are other concerns to raise of course, but I’ll keep them in reserve for when the next study that kicks this whole issue off again is published. Shouldn’t be more than a couple of months.”

Indeed.

Bullshit and Beyond: From Chopra to Peterson

Harry G. Frankfurt’s On Bullshit is a modern classic. He highlights the style-over-substance tenor of the most fragrant and flagrant bullshit, arguing that

It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction. A person who lies is thereby responding to the truth, and he is to that extent respectful of it. When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he considers his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.

In other words, the bullshitter doesn’t care about the validity or rigour of their arguments. They are much more concerned with being persuasive. One aspect of BS that doesn’t quite get the attention it deserves in Frankfurt’s essay, however, is that special blend of obscurantism and vacuity that is the hallmark of three world-leading bullshitters of our time:  Deepak Chopra, Karen Barad (see my colleague Brigitte Nerlich’s important discussion of Barad’s wilfully impenetrable language here), and Jordan Peterson. In a talk for the University of Nottingham Agnostic, Secularist, and Humanist Society last night (see here for the blurb/advert), I focussed on the intriguing parallels between their writing and oratory. Here’s the video of the talk.

Thanks to UNASH for the invitation. I’ve not included the lengthy Q&A that followed (because I stupidly didn’t ask for permission to film audience members’ questions). I’m hoping that some discussion and debate might ensue in the comments section below. If you do dive in, try not to bullshit too much…