Standards at Cambridge just ain’t what they used to be…

I’ve been swamped with the day job of late so my rate of blogging has accordingly dropped substantially. But I woke up this morning, bleary-eyed, and checked my Outlook inbox to find, nestled between the usual spam conference invitations from predatory publishers [1], an e-mail about this Guardian article: Cambridge University rescinds Jordan Peterson invitation. (Thanks, Lori. Peterson to wake up to at 6:00 am. You’re too kind.) And I just can’t let this go without a quick post before I get back to the e-mail backlog.

Just what the hell was Cambridge thinking?

Peterson’s pathetically transparent, overwrought, and highly lucrative “anti-PC” crusades are of course entirely at odds with the ethos of Cambridge, and the university’s staff and students quickly and forcefully pointed this out. [2]

But what I can’t get my head around is how and why the invitation to Peterson was made in the first place. One would hope that Cambridge of all places would very carefully consider and vet the scholarship of any visiting fellow. Fellowships are generally exceptionally difficult to secure. Did no-one involved with inviting Peterson take the time to read and assess his writings and witterings?

This, for example…

(from his, um, “seminal” Maps Of Meaning.)

Cambridge took that seriously? Over the years, I’ve received green ink letters and e-mails that rank at the top of the Baez scale and yet make much more sense.

Or what about Peterson’s lobster nonsense, as, for example, forensically dissected by Bailey Steinworth, a third-year PhD researcher, in her masterful take-down last year? Here’s Steinworth’s closing argument. (I urge you to read the entire piece.)

“No biologist would argue with Peterson that dominance hierarchies have probably existed for a long time, but it’s also true that plenty of animals live together without the need to assert dominance over one another. It seems as if his discussion of lobsters illustrates far more about his own worldview than it does about human behavior, but he’s the psychologist, not me.”

Peterson’s lobster fixation is a fantastic example of what Feynman described as Cargo Cult science — all of the hallmarks of science but lacking the essential objectivity and self-critical reasoning. Yet this level of “scholarship” is good enough to warrant a visiting fellowship at one of Britain’s most august seats of learning?

And the less said about Peterson’s wilfully uninformed playing to the gallery when it comes to climate change, the better.

It takes a minimal amount of background reading about Peterson to discern the “Emperor’s New Clothes” character of his appeal. It’s rather depressing that academics of the calibre of those who lecture in the hallowed halls of Cambridge couldn’t manage this modicum of research. As a starting point, I thoroughly recommend Nathan J. Robinson’s profile of Peterson: “The Intellectual We Deserve“. Or for a rather more pithy insight into Peterson’s style-over-substance shtick, Private Eye nailed it in this parody.

It’s very worrying indeed that the standard of scholarship required of visiting academics at what is arguably Britain’s most prestigious university [3] has slipped this low [4]. 


[1] These somehow always seem to make it through Nottingham’s otherwise rather gung-ho spam filter…

[2] Peterson will be rubbing his hands with glee at the news that his invitation has been rescinded. What better example of the “PC orthodoxy”/cultural Marxists/leftist snowflakes/ (…insert tiresome cliche of choice...)  clamping down on his free speech could there be? He’ll dine out on this for quite some time.

[3] Settle down, Oxford.

[4] However, Cambridge — or, at least, its associated publisher, Cambridge University Press — has form when it comes to pseudoscientific woo.

Given a good Hyding…

Marina Hyde is on wonderfully acerbic form in today’s Guardian, masterfully knocking Prof. Peterson’s polemic down a peg or two…

What’s particularly delicious, however, is that after Hyde highlights Peterson’s humourless, po-faced, “woe is me(n)” shtick, the comments section lights up with, you guessed it, humourless, po-faced Peterson disciples whining about the lack of intellectual rigour in the article. An article published in the, um, “Lost In Showbiz” column…

Let’s close with a verse from the Good Book. I think that Rule #9 is especially apposite: “Assume that the person you are listening to might know something you don’t.”

Bullshit and Beyond: From Chopra to Peterson

Harry G Frankfurt‘s On Bullshit is a modern classic. He highlights the style-over-substance tenor of the most fragrant and flagrant bullshit, arguing that

It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction. A person who lies is thereby responding to the truth, and he is to that extent respectful of it. When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he considers his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.

In other words, the bullshitter doesn’t care about the validity or rigour of their arguments. They are much more concerned with being persuasive. One aspect of BS that doesn’t quite get the attention it deserves in Frankfurt’s essay, however, is that special blend of obscurantism and vacuity that is the hallmark of three world-leading bullshitters of our time:  Deepak Chopra, Karen Barad (see my colleague Brigitte Nerlich’s important discussion of Barad’s wilfully impenetrable language here), and Jordan Peterson. In a talk for the University of Nottingham Agnostic, Secularist, and Humanist Society last night (see here for the blurb/advert), I focussed on the intriguing parallels between their writing and oratory. Here’s the video of the talk.

Thanks to UNASH for the invitation. I’ve not included the lengthy Q&A that followed (because I stupidly didn’t ask for permission to film audience members’ questions). I’m hoping that some discussion and debate might ensue in the comments section below. If you do dive in, try not to bullshit too much…

When is a skeptic not a skeptic?

I’m looking forward to giving this talk for the UoN Agnostic, Secularist and Humanist (UNASH) society (“Think Rationally, Act Compassionately“) on Wednesday…

[Poster advertising the UNASH talk]

The ‘blurb’ is as follows…

Everyone is a sceptic these days. The death of expertise, as described so compellingly by Tom Nichols in his recent book, has unleashed a tsunami of wilfully uninformed ‘critiques’ of everything from the shape of the Earth to the ability of women to do physics. This toxic blend of ignorance, arrogance, and unblinking credulity now fuels a very significant fraction of internet bandwidth. A little learning is indeed a dangerous thing.

In this talk, I’ll focus on the thorny problem of just how we counter the type of scepticism that brought the world Pizzagate, the ‘truth’ about 9-11, and an ever-expanding set of ever-more-ludicrous conspiracy theories. On the way, we’ll consider the style-over-substance rhetoric and pseudo-scepticism that internet gurus like Deepak Chopra and Jordan B Peterson exploit to woo uncritical audiences (of self-proclaimed sceptics).

I’m hoping that some robust discussion and debate will ensue…

Private Eye parodies Peterson’s purple prose

If you can’t dazzle them with brilliance, baffle them with bullshit.

You can fool some of the people some of the time – and that’s enough to make a decent living

WC Fields (1880-1946)


I got the new issue of Private Eye today and was tickled by their effortless lampooning of Jordan Peterson‘s tediously overwrought writing. Just like the worst of the postmodernists he so despises (and Alan Sokal so memorably ridiculed),  Peterson’s needlessly ornate, florid and flaccid prose is a triumph of (poor) style over substance…

[Scan of the Private Eye parody of Peterson’s prose]

“…it is a wise rule and good rule to hold back from skating when there is no ice visible on the water.”

Indeed. And thus endeth the lesson.

The truth, the whole truth, and nothing but…

This video, which Brady Haran uploaded for Sixty Symbols back in May, ruffled a few feathers…

I’ve been meaning to find time to address some of the very important and insightful points that were raised in the discussions under the video, but I’ve been …

Errrm. Sorry. Hang on just one minute. “Very important and insightful points” you say? Under a YouTube video? Yeah, right…

Believe me, I fully appreciate your entirely justified scepticism here but, yes, if you scroll past the usual dose of grammatically garbled, content-free boilerplate from the more cerebrally challenged, you’ll find that the comments section contains a considerable number of points that are entirely worthy of discussion. In fact, I’m going to be using some of those YouTube comments to prompt debate during the Politics, Perception and Philosophy of Physics (PPP) module that my colleague Omar Almaini and I run in the autumn semester.

Before I get into considering specific comments, however, I’ll just take a brief moment to highlight a central theme “below the line” of that video, viz. the absolute faith in the trustworthiness and reliability of the scientific method. Or, more accurately, the monolith that is The Scientific Method. Many who contribute to that comments section are utterly convinced that The Truth, however that might be defined, will always win out against the inherent messiness of the scientific process. Well, maybe. Possibly. But on what time scale? And with what implications for the progress of science in the meantime? Wedded entirely to their ideology without ever presenting any evidence to support their case, they are completely convinced that they know exactly how science works. Often without ever doing science themselves. This is hardly the most scientific of approaches.

OK, deep breath. I’m going in. Let’s delve into the comments section…

[Screenshot of YouTube comments, including one from “Ali Syed”]

The idea that science progresses as a nice linear, objective process from hypothesis to “fact” is breathtakingly naive. Unfortunately, it’s exceptionally difficult for some to countenance, within their rather rigid worldview and mindset, that science could ever be inherently messy and uncertain. As “Ali Syed” notes above, this can indeed lead to quite some intellectual indigestion for some…

[Screenshot of “cavalrycome”’s comment]

“cavalrycome” here helpfully serves up a key example of that breathtaking naivety in action. The idea that testing scientific theories doesn’t depend on social factors and serendipity shows a deep and touching faith — and I use that word advisedly — in the tenets of The Scientific Method. “Just do the experiment” is the mantra. Or, as Feynman put it,

 If it disagrees with experiment it is wrong. In that simple statement is the key to science. It does not make any difference how beautiful your guess is. It does not make any difference how smart you are, who made the guess, or what his name is – if it disagrees with experiment it is wrong. That is all there is to it.

(I guess it goes without saying that, as is the case for so many physicists, Feynman is a bit of a hero of mine).

…all well and good, except that doing the experiment simply isn’t enough. The same experimental data can be (mis)interpreted by different scientists in many ways. I could point to very many examples but let’s choose one that hits close to home for me.

Along with colleagues in Liverpool, Nottingham, and Tsukuba, I spent a considerable amount of my time a few years back embroiled in a critique of scanning probe microscope (SPM) images of so-called ‘stripy’ nanoparticles. I am not about to open that can of worms again. (Life is too short). For an overview, see this post.

Without going into detail, the key point is this: we had our interpretation of the data, and the group whose work we critiqued had theirs. On more than one occasion, the fact that their interpretation had been previously published and regularly cited was used to justify their position. (I thoroughly recommend Neuroskeptic’s post on the central role of data interpretation in science. And this follow-up post.)

The testing, publication and critique of experimental (or theoretical) data fundamentally involves the scientific community at many levels. First of all, there’s the sociology of the peer review process itself. What has been previously published? Do our results agree with that previously published work? If not, can we convince the editors and referees of the validity of our data? Then there’s the question of the “impact” and excitement of the science in question. Is the work newsworthy? Will it make it to the glossy cover of the journal? Will it help secure the postdoc a lectureship or a tenure-track position?

Moreover, science requires funding.  Testing a particular theory may well require a few million quid of experimental kit, consumables, and/or staff resources. That funding is allocated via peer review. And peer review is notoriously hit and miss. I’ve seen exactly the same proposal be rejected by one funding panel and funded by another. On more than one occasion. Having the right person speak for your grant proposal at a prioritisation panel meeting can make all the difference when it comes to success in funding. (But don’t just take my word for it when it comes to how peer review (mis)steers the scientific process — a minute or two on Google is all you need to find key examples.)

Let’s complement that nanoparticle example above with some science involving rather larger length scales. Following one of the PPP sessions last year, Omar pointed me towards an illuminating blog post by Ed Hawkins on uncertainty estimates in the measurement of the Hubble constant. Here are the key data (taken from R. P. Kirshner, PNAS 101, 8 (2004)):

[Figure: measurements of the Hubble constant over time]

Note the evolution of the Hubble constant towards its currently accepted value. Feynman (yes, him again) made a similar point about the measurement of the value of the charge of the electron in his classic Cargo Cult Science talk at Caltech in 1974:

One example: Millikan measured the charge on an electron by an experiment with falling oil drops and got an answer which we now know not to be quite right.  It’s a little bit off, because he had the incorrect value for the viscosity of air.  It’s interesting to look at the history of measurements of the charge of the electron, after Millikan.  If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.

 Why didn’t they discover that the new number was higher right away?  It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong.  When they got a number closer to Millikan’s value they didn’t look so hard.  And so they eliminated the numbers that were too far off, and did other things like that.  We’ve learned those tricks nowadays, and now we don’t have that kind of a disease.

Sorry, Richard, but that disease is very much still with us. If anything, it’s a little more virulent these days…

I could go on. But you get the idea. Only someone with a complete lack of experience of scientific research could ever suggest that the testing of scientific theories/ interpretations is free of “social factors and chance”.

What say you, “AntiCitizenX”…?

[Screenshot of “AntiCitizenX”’s comment]

So, apparently, the experience of scientists means nothing when it comes to understanding how science works? This viewpoint  — and it crops up regularly — never ceases to make me smile. The progress of science depends, fundamentally and critically, on the peer review process: decisions on which papers get published and which grants get funded are driven not by an adherence to one or other “philosophy of science” (which one?) but by working scientists.

The “messy day-to-day aspects of science” are science. This is how it works. It doesn’t matter a jot what Popper, Kuhn [1], Feyerabend, Lakatos or your particular philosopher of choice might have postulated when it comes to their preferred version of The Scientific Method. What matters is how science works in practice. (Do the experiment, right?) Popper et al. did not produce some type of received, immutable wisdom to which the scientific process must conform. (On a similar theme, And Then There’s Physics – more of whom later — has written a number of great posts on the simplistic caricatures of science that have often frustratingly stemmed from the Science and Technology Studies (STS) field of sociology, including this: STS: All Talk and No Walk?)

Does this mean that I think philosophy has no role to play in science or, more specifically, physics? Not at all. In fact, I think that we do our undergraduate (and postgraduate) students a major disservice by not introducing a great deal more philosophy into our physics courses. But to argue that scientists are somehow not qualified to speak about a process they themselves fundamentally direct is ceding rather too much ground to our colleagues in philosophy and sociology. And it’s deeply condescending to scientists.

As Sean Carroll so eloquently puts it in the paper to which I refer in the video,

The way in which we judge scientific theories is inescapably reflective, messy, and human. That’s the reality of how science is actually done; it’s a matter of judgement, not of drawing bright lines between truth and falsity, or science and non-science.

True or False?

Let’s now turn to the question of falsifiability (which was, after all, in the title of the video). Over to you, “Daniel Jensen”, as your comment seems to have resonated with quite a few:

[Screenshot of “Daniel Jensen”’s comment]

This fundamentally confuses the type of “bending over backwards to prove ourselves wrong” aspect of science — yes, Feynman again — with Popper’s falsifiability criterion. I draw a distinction between these in the video but, as was pointed out to me recently by Philip Ball, when it comes to many of those who contribute below the line “it’s as if they’re damned if they are going to let your actual words deprive them of their right to air their preconceived notions”.

(At least one commenter realises this:

[Screenshot of “Shkotay D”’s comment]

Thank you, “Shkotay D”. I’d like to think so.)

The point I make in the video re. falsifiability merely echoes what Sokal and Bricmont (and others) said way back in the 90s, and Carroll has reiterated within the context of multiverse theory: Popper’s criterion simply does not describe how science works in practice. Here’s what Sokal and Bricmont have to say in Fashionable Nonsense:

When a theory successfully withstands an attempt at falsification, a scientist will, quite naturally, consider the theory to be partially confirmed and will accord it a greater likelihood or a higher subjective probability. … But Popper will have none of this: throughout his life he was a stubborn opponent of any idea of ‘confirmation’ of a theory, or even of its probability. … the history of science teaches us that scientific theories come to be accepted above all because of their successes.

The question of misinterpretation (wilful or otherwise) is also raised by “tennisdude52278”:

[Screenshot of “tennisdude52278”’s comment]

I stand by everything I said in that video. I am acutely aware of just how statements are cherry-picked, quote-mined, and ripped out of context online but that can’t be used as a justification to self-censor for the sake of “toeing the party line” or presenting a united front. Science isn’t politics, despite its messy character. It is both fundamentally dishonest and ultimately damaging to the credibility of science (and scientists) if we pretend otherwise.

“We demand rigidly defined areas of doubt and uncertainty” [2]

What I find particularly intriguing about the more overwrought responses to the video is the deep unwillingness to accept the inherent uncertainties and human biases that are inevitably at play in the progress of science. There’s a deep-rooted, quasi-religious, faith in the ability of science to provide definitive, concrete, unassailable answers to questions of life, the universe, and everything. But that’s not how science works. Carlo Rovelli forcefully makes this point in Science Is Not About Certainty:

“The very expression “scientifically proven” is a contradiction in terms. There’s nothing that is scientifically proven. The core of science is the deep awareness that we have wrong ideas, we have prejudices…we have a vision of reality that is effective, it’s good, it’s the best we have found so far. It’s the most credible we have found so far; it’s mostly correct.”

The craving for certainty is, however, a particularly human characteristic. We’re pattern-seekers; we love to find regularity, even when there’s no regularity there. And there are some who know very well how to effectively exploit that desire for certainty. This article on the guru appeal of Jordan B Peterson highlights just how the University of Toronto professor of psychology plays to the gallery in fulfilling that need:

“He sees the vacuum left not just by the withdrawal of the Christian tradition, but by the moral relativism and self-abnegation that have flooded across the West in its wake. Furthermore, he recognizes — from his experience as a practicing psychologist and as a teacher — that people crave principles and certainties.”

In passing, I should note that I disagree with the characterisation of Peterson in that article as a man who espouses ideas of depth and substance. No. Really, no. (Really, really, no.) He’s of course an accomplished and charismatic public speaker (with a particular talent for obfuscation that rivals, worryingly, that of politicians.) But then so too is Deepak Chopra. [3]

I’ve spent rather too much of my time over the last year discussing Peterson’s self-help shtick in various fora on- and offline. I’m particularly grateful to And Then There’s Physics for highlighting a debate I had with Fred McVittie last year on a motion of particular relevance to this post, “Jordan Peterson speaks the truth“. The comments thread under ATTP’s post runs to over 400 comments, highlighting that the cult of Peterson is fascinating in terms of its social dynamics. Unfortunately, what Peterson himself has to say is a great deal less interesting, and often mind-numbingly banal, than the underlying sociology of his flock.

What Peterson clearly recognises, however, is that certainty sells. Humans tend to crave simple and simplistic messages, free of the type of ambiguity that is so often part-and-parcel of the scientific process. So he dutifully, and profitably, becomes the source of memes and headline messages so simple that they can feature comfortably on the side of a mug of coffee:

[Image: a Peterson “rule” printed on a coffee mug]

Comforting though Peterson’s simplistic and dogmatic rules for life might be for many, I much prefer the honesty that underpins Carl Sagan‘s rather more ambiguous and uncertain outlook…

Science demands a tolerance for ambiguity. Where we are ignorant, we withhold belief. Whatever annoyance the uncertainty engenders serves a higher purpose: It drives us to accumulate better data. This attitude is the difference between science and so much else.

[1] I’m not a fan of Kuhn’s writings, I’m afraid. I am well aware that “The Structure of Scientific Revolutions” is held up as some sort of canonical gospel when it comes to the philosophy of science, but “…Scientific Revolutions” is merely Kuhn’s opinion. Nothing more, nothing less. It’s not the last word on the nature of progress in science. For one thing, his views on the lack of “commensurability” of different paradigms are clearly bunkum in the context of quantum physics and relativity. The correspondence principle in QM alone is enough to rebut Kuhn’s incommensurability argument. And just how many undergrad physics students have been tasked in their first year to consider a problem in QM or special relativity in “the classical limit”…?

[2] Treat yourself to a nice big bowl of petunias if you recognise the source of the quote here.

[3] As an aside to the aside, what I find remarkable is that the subadolescent drawings and scribblings that decorate Peterson’s “Maps Of Meaning” were apparently offered to Harvard psychology undergraduates as part of their education. (Actually, that’s rather unfair to those adolescents who would be mortified at being linked in any way with the likes of these ravings.) Unlike Peterson, I’m not about to wring my hands, clutch my pearls, and call for a McCarthyite purge of undergraduate teaching in his discipline. But let’s just say that my confidence in the quality assurance mechanisms underpinning psychology education and research has been dented just a little. (Diederik Stapel’s autobiography also didn’t reassure me when it comes to the lack of reproducibility that plagues psychology research.) I’ll concur entirely with Prof. Peterson on this point: it’s indeed best to get one’s own house in order before criticising others…