“Some down-to-earth blue sky thinking”

“… a dangerous convergence proceeds apace 

as human beings confer life on machines and

in so doing diminish themselves. 

Your calculus may be greater than his calculus 

but will it pass the Sullenberger Hudson river test?”

from “Insulting Machines”, Mike Cooley

(Published in AI and Society 28 373 (2013))


Last week, I listened to some of the most thought-provoking — and occasionally unsettling — presentations and discussions that I’ve encountered throughout my academic career. On Tuesday, I attended, and participated in, the 2019 Responsible Research and Innovation Conference (organised by Nottingham’s Graduate School and the Institute for Science and Society), while on Wednesday the School of Physics and Astronomy hosted the British Pugwash Ethical Science half-day conference:

More on both of these soon. But before I describe just why I found those conferences as affecting as I did, I wanted to highlight last Monday’s session for the Politics, Perception, and Philosophy of Physics (PPP) module. This was the first of this year’s PPP sessions where the students were given free rein to contribute via debate and discussion, and both Omar Almaini (the co-convenor of PPP) and I were exceptionally impressed by their thoughtful and spirited contributions. (The first three sessions of PPP are in the traditional lecture format. Sessions 4 – 11 are much more akin to the seminar style that is common in arts and humanities disciplines but is very much not the norm in physics courses.)

I have always found the clichés surrounding the STEM vs arts & humanities divide extremely tiresome, and it’s a delight when our students demolish the lazy stereotypes regarding the supposed lack of communication skills of physicists. (Similarly, one of the invited speakers for PPP this year, the sociologist Harry Collins, has shown that social scientists can perform comparably to – or even better than — physicists when it comes to answering physics questions. See “Sociologist Fools Physics Judges” (Nature, 2006) for compelling evidence. More from (and about) Prof. Collins in future posts…)

The title of last Monday’s PPP session was “The Appliance (and non-appliance) of Science” and the slides are embedded below. (Those of you who, like myself, are of a certain vintage might recognise the tag line of the title.)

 

The students drove an hour-long discussion that initially focussed on the two questions given on Slide #3 of the PowerPoint file above but rapidly diverged to cover key related points such as science comms, public engagement, hostility to expertise, and political polarisation. The discussion could easily have run well beyond an hour — there were still hands being raised after we’d been in the seminar room for 90 minutes. As is traditional for PPP, I noted down students’ points and questions on the whiteboard as the discussion proceeded. Here are just two of the eight whiteboards’ worth of material…

IMG_8048

IMG_8056

(The remainder of the slides are available at the PPP website.)

In case you can’t read my appalling handwriting, one of the first points raised by the students was the following:

“Curiosity is more than a valid reason to fund research” 

This view kicked off a lot of discussion, culminating in the polar opposite view expressed at the bottom of the whiteboard summary below: “What’s the point of funding anything other than global warming research?”

IMG_8049

“Humanity came and destroyed the world”

The theme of the PPP session last Monday was chosen to align with the Responsible Research and Innovation (RRI2019) and Ethical Science conferences on the following days. This post would be 10,000 words long if I attempted to cover all of the key messages stemming from these conferences so I’ll focus on just a few highlights (out of very many). This story, by Dimitris Papadopoulos‘ daughter, was a sobering introduction to the motivations and agenda of RRI2019…

Dimitris was a driving force behind the organisation of RRI2019 (alongside colleagues in the Graduate School) and in his presentation he highlighted key aspects of the RRI framework that would recur time and again throughout the day: generational responsibility; designing for the future;  the realisation that what we create often has a lifespan far beyond our own; “the burden is not on the individual researcher” but we are collectively changing the planet.

He also stressed that, in his view, the primary task of science is not just to understand.

In the context of RRI I have a great deal of sympathy with Dimitris’ stance on this latter point. But I also found it rather unsettling because science that is as disinterested as possible and focussed solely on understanding the nature of the world/universe around us has to be a component of the research “landscape”, not least because, time and again throughout history, curiosity-driven science has led to truly disruptive innovations. (Some to the immense benefit of humanity; others less so, admittedly.) Moreover, we need to be exceptionally careful to retain the disinterested character of pure scientific research when it comes to ensuring public trust in just what we do — an issue to which I returned in another RRI2019 session (see below).

Prof. Sarah Sharples, PVC for Diversity, Equality, and Inclusion, was next to speak and made powerful and pointed arguments that senior university (and, indeed, University) management, politicians, and funding bodies of all stripes need to take on board: look beyond simplistic metrics and league tables when it comes to assessing what it means for research to be successful. Sarah highlighted the importance of unintended consequences, particularly when it comes to the ironies of automation; clinical care, in particular, is not just about recording numbers and data.

IMG_8068

Pete Licence, Professor of Chemistry and Director of The GlaxoSmithKline Carbon Neutral Laboratory, continued on the theme of being wary and cognisant of the possibility and potential of unintended consequences, but stressed that sometimes those consequences can be much more positive than we could have ever anticipated. Pete described his collaboration with a number of Ethiopian scientists, which has radically changed both his and their approach to not just the science but the economics associated with green chemistry. He also echoed Sarah Sharples’ key point on the matter of ensuring that we never lose sight of the humanity behind the metrics and tick-boxes: too many lenses mean that, paradoxically, we can often lose focus…

Maybe, Minister?

The RRI conference then split into parallel sessions. This unfortunately meant that I couldn’t go along to the Society and Responsibility discussion — which I was keen to attend (not least because my friend and colleague Brigitte Nerlich was a member of the panel) – as I was participating in the Responsibility in Research and Policy session happening at the same time, alongside Chris Sims (Head of Global Policy Impact at UoN and the Chair and organiser of the session), Steven Hill (Director of Research at Research England, and formerly Head of Policy at HEFCE), and Richard Masterman, UoN’s Associate PVC for Research Strategy and Performance. (All-male panels are never a good look but, in the organisers’ defence, the panel was not initially male only — the original speaker, Dr. Karen Salt (Director of the Centre for Research in Race and Rights at UoN), unfortunately couldn’t make it — and the parallel Society and Responsibility session involved an all-female panel.)

Steven and I have debated and discussed the issues surrounding HEFCE’s, and the research councils’, approach to research impact on a number of occasions — some more heated than others — over the years. (I was very pleased to find that we seem to have converged (give or take) on the middle ground after all these years.) After Chris framed the key themes of the panel discussion, we each had approximately ten minutes to make our case. Steven’s contribution focussed on the core issue of just how research should (or should not) inform policy and just what RRI should look like in that “space”.

The trade-offs and tensions between researchers and politicians were a core theme of Steven’s argument. To a scientist, the answer to any question is invariably “More research is needed”; a politician, on the other hand, ideally has to make a decision, sometimes urgently, on the basis of the evidence at hand. And the last thing they want to be told is that more research is needed. This was also the resounding message I got at Westminster when I participated (along with my Physics & Astronomy colleague Clare Burrage) in the Royal Society’s MP-Scientist scheme back in 2013: science really is not as far up the pecking order as we scientists might like. For this reason, I enthusiastically recommend Chris Tyler‘s illuminating “Top 20 things scientists need to know about policy-making” to the PPP class every year.

Steven mentioned Roger Pielke Jr’s “honest broker” concept — whereby scientists should be entirely disinterested, fully objective reporters of “The Truth” (however that might be defined) when interacting with politicians and policy. In other words, any tendency towards activism — i.e. promoting a particular (geo)political standpoint — should be avoided entirely. I have major qualms with Pielke’s thesis but Ken Rice (aka “…And Then There’s Physics“) has dealt with these much more comprehensively and eloquently than I could ever manage.

I was also put in mind, on more than one occasion during Steven’s presentation, of “The Thick Of It” clip below (which also features in the PPP course each year. Apologies for the audio quality.)

Richard then outlined the University of Nottingham’s views on the policy-research interface, before I presented the following [1]:

 

The ensuing discussion amongst the panel members, with a lively Q&A from the floor, touched on many of the same points that had been raised during the PPP session the day before: the disinterestedness of research, basic vs applied science, polarisation in politics, trust in scientists (and other professions), the commercialisation of academic research (which was the subject of a particularly pointed question from Jane Calvert in the audience – more on whom below), and balancing public, political, academic, and commercial drivers.

Synthetic Aesthetics and The Wickedness of Global Challenges

In the first session after lunch, the aforementioned Prof. Calvert, of the School of Social and Political Science at Edinburgh, presented an enthralling keynote lecture entitled Responsible Innovation and Experimental Collaboration, in which she described her adventures in synthetic biology, with a particular focus on cross-disciplinary interactions between artists, scientists (of both the social and life variety), and designers.

IMG_8076.JPG

A particularly fascinating aspect of Prof. Calvert’s talk was the description of her work on the Synthetic Aesthetics project, from which a book (among many other “outputs”) has stemmed. I’ll quote directly from the blurb for the book because it captures the core message of Jane’s talk:

In this book, synthetic biologists, artists, designers, and social scientists investigate synthetic biology and design. After chapters that introduce the science and set the terms of the discussion, the book follows six boundary-crossing collaborations between artists and designers and synthetic biologists from around the world, helping us understand what it might mean to ‘design nature.’ These collaborations have resulted in biological computers that calculate form; speculative packaging that builds its own contents; algae that feeds on circuit boards; and a sampling of human cheeses. They raise intriguing questions about the scientific process, the delegation of creativity, our relationship to designed matter, and the importance of critical engagement. Should these projects be considered art, design, synthetic biology, or something else altogether?

I have a long-standing interest in the interface between the arts and the sciences — see, for example, The Silent Poetry of Paint Drying, and these posts — so was fascinated by the interweaving of function, form, and, errmm, fungi in the Synthetic Aesthetics project…

IMG_8095.JPG

The second post-lunch keynote was from Prof. Phil Macnaghten (Wageningen University & Research (WUR), Netherlands), whose work with Matthew Kearnes and James Wilsdon on the ESRC-funded “Governing At The Nanoscale: People, Policies, and Emerging Technologies” project (published in this Demos pamphlet) was more than partly responsible for sparking my nascent interest in the sociology of (nano)science and technology more than a decade ago. Phil’s talk at RRI2019 focussed on how RRI was embedded in practice and policy at the local (WUR), national (EPSRC), and international (Brazil, which is enduring vicious cuts to its science budget) levels.

The Sounds of (Responsible) Salesmen…

I unfortunately only caught the last fifteen minutes or so of the Molecules and Microbes parallel session — chaired by Pete Licence and featuring Prof Steve Howdle (Chemistry, Nottingham), Prof Liz Sockett & Dr Jess Tyson (Life Sciences, Nottingham), and Prof Panos Soultanas (Chemistry, Nottingham) — and so can’t really comment in detail. Panos’ impassioned plea for support for basic, curiosity-driven science certainly resonated, although I can’t say I entirely agreed with his suggestion that irresponsible research wasn’t an issue. (I may have misinterpreted what he meant, however — I didn’t catch all of his presentation.)

The closing plenary was expertly chaired by Dr. Alison Mohr, who introduced, in turn, Dr. Eleanor Kershaw (Synthetic Biology Centre, UoN), Prof. Richard Jones (Physics, University of Sheffield, and erstwhile PVC for Research and Innovation there), and Prof. Martyn Poliakoff. I have known Richard for over fifteen years and have always enjoyed his informed and engaging takes on everything from nanotechnology to transhumanism to the UK’s productivity crisis, via a variety of talks I’ve attended and his blog, Soft Machines. (I also had the pleasure of spending a week at an EPSRC sandpit back in 2007 that was coordinated and steered — in so far as it’s possible to steer a roomful of academics — by Prof. Jones.)

In his plenary, Richard stressed the “scientist as responsible salesman” theme that he has put forward previously (as one of many dimensions of responsibility.) For a characteristically comprehensive analysis of responsible innovation (and irresponsible stagnation), I thoroughly recommend this Soft Machines post.

Martyn Poliakoff brought the conference to a close in his ever-engaging and inimitable style, with a compelling vision of what he and his colleagues have described as a Moore’s law for chemistry,

… namely that over a given period, say five years, sustainable chemists should strive to reduce the amount of a chemical needed to produce a given effect by a factor of two and this process should be repeated for a number of cycles. The key will be to make the whole concept, especially the economics, work for everyone which will require a change in business model for the chemicals market.

[Quote taken from A New Approach to Sustainability: A Moore’s Law for Chemistry, M. Poliakoff, P. Licence, and M. George, Angew. Chem. Int. Ed. 57 12590 (2018)]
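The arithmetic behind the proposal is simple compound halving: if the amount of a chemical needed for a given effect is cut by a factor of two each cycle, the requirement falls geometrically. Here is a minimal sketch of that arithmetic (my own illustration, with a hypothetical function name — not code from the Poliakoff, Licence, and George paper):

```python
def amount_needed(initial_amount, cycles, factor=2.0):
    """Amount of a chemical required after repeatedly reducing it
    by `factor` once per cycle (one cycle ~ five years in the proposal)."""
    return initial_amount / factor ** cycles

# After 4 five-year cycles (20 years) of halving, a process that once
# needed 100 kg of a reagent would need 100 / 2**4 = 6.25 kg.
print(amount_needed(100.0, 4))  # → 6.25
```

As with the semiconductor version of Moore’s law, the interesting question is not the exponential itself but whether the economics can be made to sustain it cycle after cycle.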

“Remember your humanity, and forget the rest.”

Although the word Pugwash has an alternative “resonance” for many of us kids of the sixties/seventies, the Pugwash Conferences on Science and World Affairs, and the subsequent International Student/Young Pugwash movement, take their name from the town in Nova Scotia, Canada, where Joseph Rotblat and Bertrand Russell established, in 1957, the international organisation to bring together scientists and public figures to address global security, armed conflict, and the threat of weapons of mass destruction (including, in particular, nuclear warfare). The Pugwash conferences were initiated two years after the Russell-Einstein manifesto was issued, which in turn stemmed from Russell’s deep fears about atomic weapons:

The prospect for the human race is sombre beyond all precedent. Mankind are faced with a clear-cut alternative: either we shall all perish, or we shall have to acquire some slight degree of common sense. A great deal of new political thinking will be necessary if utter disaster is to be averted.

Jo(seph) Rotblat was awarded the Nobel Peace Prize in 1995 “for efforts to diminish the part played by nuclear arms in international affairs and, in the longer run, to eliminate such arms.”

I have organised a number of joint events with British Pugwash — more specifically, with Andrew Gibson, the British Pugwash Student Manager — over the last few years, including a PPP seminar given back in Nov. 2016 by Prof. John Finney (UCL), Pugwash Trustee, and a tireless advocate for the organisation. Alongside Peter Jenkins, Chair of British Pugwash, John kicked off the Ethical Science conference at Nottingham last Wednesday with a fascinating account of the history of Pugwash and, in particular, Jo Rotblat’s inspiring life.

Rotblat.png

Dr. Ian Crossland then discussed the ethics and intergenerational issues surrounding nuclear power, followed by a stirring presentation by Sam Harris, climate activist and Nottingham Trent Labour Society’s campaigns officer, on Labour’s Green New Deal.

LauraNolan.png

A particular highlight of not just the Pugwash conference but of all of last week’s events was Laura Nolan‘s remarkable presentation, delivered with tons of energy and passion. (I try to avoid the p-word, given that it’s an obnoxiously lazy cliché, but in this case it is more than justified.) Laura, a Trinity College Dublin computer science graduate, resigned from Google, where she was a software engineer, in 2017 after she was asked to work on a project whose focus was the enhancement of US military drone technology. Laura’s story is recounted in this important Guardian article. (See also this interview.) The quote below, from that article, captures the issues that Laura covered in her talk at the Pugwash conference.

“If you are testing a machine that is making its own decisions about the world around it then it has to be in real time. Besides, how do you train a system that runs solely on software how to detect subtle human behaviour or discern the difference between hunters and insurgents? How does the killing machine out there on its own flying about distinguish between the 18-year-old combatant and the 18-year-old who is hunting for rabbits?”

Anuradha Damale — currently of the Verification Research, Training and Information Centre, and a fellow physicist — had a tough act to follow but she delivered a great talk with quite some aplomb, despite having lost her voice! Anuradha covered the troublesome issue of nuclear weapons verification programmes and, despite the lack of vocal volume, participated in a lively Q&A session with Laura following their talks.

I’m going to close this post with the source of its title: “Down-to-earth blue sky thinking”. The inspiring video embedded below was shown by Tony Simpson — who also discussed Mike Cooley’s pioneering work on the influence of technology on society (and whose prose poem, “Insulting Machines“, is quoted above) — during the closing presentation of the Pugwash conference.

I’ve waffled on for much too long at this point. Let’s hear instead from those whose actions spoke so much louder than words…

 


 

[1] It’s unfortunately not clear from the embedded SlideShare widget of the slides but I cited (and quoted from) this influential blog post when crediting Gemma Derrick and Paul Benneworth with coining the “grimpact” term.

Induction, Deduction, Reduction

This is Lecture #2 of “The Politics, Perception, and Philosophy of Physics” (PPP) module. (Lecture #1 is here.) The PowerPoint slides are also embedded below. (The blog post referenced on Slide #25 is here.)

Science Proves Nothing

Here’s the first, provocatively titled, lecture for this year’s “Politics, Perception, and Philosophy of Physics” module. This year, I plan to upload video here for each F34PPP session on a weekly schedule (although the best-laid plans gang aft agley…)

Erratum: Around about the 43 minute mark I say “Polish group” when I mean “Czech group”. (Apologies to Pavel Jelinek et al.)

Down On The Upside

I stumbled across the wonderful skepticalscience.com website last night (via Ken Rice‘s Twitter feed) and just had to quickly blog about this brilliant, at-a-glance rebuttal of that hoary old “The data don’t lie” aphorism. The graph speaks for itself…

 “But Philip, I thought you’d sworn off Twitter?” I have — I killed my Twitter account almost four years ago and have not once regretted it since. For one thing, a Twitter account is not required in order to read tweets and I occasionally dip into the Twitter threads of colleagues and friends I used to follow (Ken among them) via search.twitter.com.

“We don’t need no education…”

(…or Why It Sometimes Might Be Better For Us Academics to Shut The F**k Up Occasionally.)

“Boost Public Engagement to Beat Pseudoscience, says Jim Al-Khalili” goes the headline on p.19 of this week’s Times Higher Education, my traditional Saturday teatime read. The brief article, a summary of points Jim made during his talk at the Young Universities Summit, continues…

Universities must provide more opportunities for academics to engage with the public or risk allowing pseudoscience to “fill the vacuum”, according to Jim Al-Khalili.

Prof. Al-Khalili is an exceptionally talented and wonderfully engaging science communicator. I enjoy, and very regularly recommend (to students and science enthusiasts of all stripes), his books and his TV programmes. But the idea that education and academic engagement are enough to counter pseudoscience is, at the very best, misleading and, at worst, a dangerous and counter-productive message to propagate.

The academic mantra of “education, education, education” as the unqualified panacea for every socioeconomic ill, although comforting, is almost always a much too simplistic — and, for some who don’t share our ideological leanings, irritatingly condescending — approach. I’ve written enthusiastically before about Tom Nichols’ powerful “The Death of Expertise”, and I’ve lost count of the number of times that I’ve referred to David McRaney’s The Backfire Effect in previous posts and articles I’ve written. It does no harm to quote McRaney one more time…

The last time you got into, or sat on the sidelines of, an argument online with someone who thought they knew all there was to know about health care reform, gun control, gay marriage, climate change, sex education, the drug war, Joss Whedon or whether or not 0.9999 repeated to infinity was equal to one – how did it go?

Did you teach the other party a valuable lesson? Did they thank you for edifying them on the intricacies of the issue after cursing their heretofore ignorance, doffing their virtual hat as they parted from the keyboard a better person?

Perhaps you’ve been more fortunate than McRaney (and me.) But somehow I doubt it.

As just one example from McRaney’s list, there is strong and consistent evidence that, in the U.S., Democrats are much more inclined to accept the evidence for anthropogenic climate change than Republicans. That’s bad enough, but the problem of political skew in motivated rejection of science is much broader. A very similar right-left asymmetry exists across the board, as discussed in Lewandowsky and Oberauer’s influential paper, Motivated Rejection of Science. I’ll quote from their abstract, where they make the same argument as McRaney but in rather more academic, though no less compelling, terms [1]:

Rejection of scientific findings is mostly driven by motivated cognition: People tend to reject findings that threaten their core beliefs or worldview. At present, rejection of scientific findings by the U.S. public is more prevalent on the political right than the left. Yet the cognitive mechanisms driving rejection of science, such as the superficial processing of evidence toward the desired interpretation, are found regardless of political orientation. General education and scientific literacy do not mitigate rejection of science but, rather, increase the polarization of opinions along partisan lines.

Let me repeat and embolden that last line for emphasis. It’s exceptionally important.


General education and scientific literacy do not mitigate rejection of science but, rather, increase the polarization of opinions along partisan lines.


If we blithely assume that the rejection of well-accepted scientific findings — and the potential subsequent descent into the cosy embrace of pseudoscience — is simply a matter of a lack of education and engagement, we fail to recognise the complex and multi-faceted sociology and psychology at play here. Yes, we academics need to get out there and talk about the research we and others do — and I’m rather keen on doing this myself (as discussed here, here, and here) — but let’s not make the mistake of assuming that there’s always a willing audience waiting with bated breath for the experts to come and correct them on what they’re getting wrong.

I spend a lot of time on public engagement, both online and off — although not, admittedly, as much as Jim — and I’ve encountered the “motivated rejection” effect time and time again over the years. Here’s just one example of what I mean — a comment posted under the most recent Computerphile video I did with Sean Riley:

ZeroCred

The “zero credibility” comment stems not from the science presented in the video but from a reaction to my particular ideological and political leanings. For reasons I’ve discussed at length previously, I’ve been labelled as an “SJW” — a badge I’m happy to wear with quite some pride. (If you’ve not encountered the SJW pejorative previously, lucky you. Here’s a primer.) Because of my SJW leanings, the science I present, regardless of its accuracy (and level of supporting evidence/research), is immediately rejected by a subset of aggrieved individuals who do not share my political outlook. They outright dismiss the credibility or validity of the science not on the basis of the content or the strength of the data/evidence but solely on their ideological, emotional, and knee-jerk reaction to me…

Downvoting

(That screenshot above is taken from the comments section for this video.)

It’s worth noting that the small hardcore of viewers who regularly downvote and leave comments about the ostensible lack of credibility of the science I present are very often precisely those who would claim to be ever-so-rational and whose clarion call is “Facts over feels” [3]. Yet they are so opposed to my “SJW-ism” that they reject everything I say, on any topic, as untrustworthy; they cannot get beyond their gut-level emotional reaction to me.

My dedicated following of haters is a microcosm of the deep political polarisation we’re seeing online, with science caught in the slip-stream and accepted/rejected on the basis of how it appeals to a given worldview, rather than on the strength of the scientific evidence itself. (And it’s always fun to be told exactly how science works by those who have never carried out an experiment, published a paper, been a member of a peer-review panel, reviewed a grant etc.) This raises the question: Am I, as a left-leaning academic with clearly diabolical SJW tendencies, in any position at all to educate this particular audience on any topic? Of course not. No matter how much scientific data and evidence I provide it will be dismissed out of hand because I am not of their tribe [3].

Jim Al-Khalili’s argument at the Young Universities Summit that what’s required is ever-more education and academic engagement is, in essence, what sociologists and Science and Technology Studies (STS) experts would describe as the deficit model. The deficit model has been widely discredited because it simply does not accurately describe how we modify our views (or not) in the light of more information. (At the risk of making …And Then There’s Physics scream, I encourage you to read their informative and entertaining posts on the theme of the deficit model.)

Prof. Al-Khalili is further reported as stating that “…to some extent, you do have to stand up and you do have to bang on about evidence and rationalism, because if we don’t, we will make the same mistakes of the past where the vacuum will be filled with people talking pseudoscience or nonsense.” 

Banging on about evidence and rationalism will have close to zero effect on ideologically opposed audiences because they already see themselves as rational and driven by evidence [3]; they won’t admit to being biased and irrational because their bias is unconscious. And we are all guilty of succumbing to unconscious bias, to a greater or lesser extent. Force-feeding more data and evidence to those with whom we disagree is not only unlikely to change their minds, it’s much more likely to entrench them further in their views. (McRaney, passim.)

Let me make a radical suggestion. What if we academics decided to engage rather less sometimes? After all, who is best placed to sway the position — on climate change, vaccination, healthcare, social welfare, or just about any topic — of a deeply anti-establishment Trump supporter who has fallen hook, line, and sinker for the “universities are hotbeds of cultural Marxism” meme? A liberal academic who can trot out chapter and verse from the literature, and present watertight quantitative (and qualitative) arguments?

Of course not.

We need to connect, somehow, beyond the level of raw data and evidence. We need to appeal to that individual’s biases and psychology. And that means thinking more cannily, and more politically, about how we influence a community. Barking, or even gently reciting, facts and figures is not going to work. This is uncomfortable for any scientist, I know. But you don’t need to take my word for it — review the evidence for yourself.

The strength of the data used to support a scientific argument almost certainly won’t make a damn bit of difference when a worldview or ideology is challenged. And that’s not because our audience is uneducated. Nor are they unintelligent. They are behaving exactly as we do. They are protecting their worldview via the backfire effect.

 


[1] One might credibly argue that the rejection skew could lean the other way on certain topics such as the anti-vaccination debate, where anecdotal, and other, evidence might suggest that there is a stronger liberal/left bias. It turns out that even when it comes to anti-vaxxers, there is a considerable amount of data supporting the conclusion that it’s the right that has the higher degree of anti-science bias [2]. Here’s one key example: Trust In Scientists On Climate Change and Vaccines, LC Hamilton, J Hartter, and K Saito, SAGE Open, July – Sept 2015, 1 – 13. See also Beyond Misinformation, S. Lewandowsky, U. K. H. Ecker, and J. Cook, J. Appl. Res. Mem. Cogn. 6 353 (2017) for a brief review of some of the more important literature on this topic.

[2] …but then it’s all lefty, liberal academics writing these papers, right? They would say that.

[3] Here’s an amusing recent example of numerological nonsense being passed off as scientific reasoning. Note that Peter Coles’ correspondent claims that the science is on his side. How persuasive do you think he’ll find Peter’s watertight, evidence-based reasoning to be? How should he be further persuaded? Will more scientific evidence and data do the trick?

 

If it seems obvious, it probably isn’t

…And Then There’s Physics’ post on science communication, reblogged below, very much struck a chord with me. This point, in particular, is simply not as widely appreciated as it should be:

“Maybe what we should do more of is make it clear that the process through which we develop scientific knowledge is far more complicated than it may, at first, seem.”

There can too often be a deep-seated faith in the absolute objectivity and certainty of “The Scientific Method”, which possibly stems (at least in part) from our efforts to not only simplify but to “sell” our science to a wide audience. The viewer response to a Sixty Symbols video on the messiness of the scientific process, “Falsifiability and Messy Science”, brought this home to me: The Truth, The Whole Truth, and Nothing But…

(…but I’ve worried for a long time that I’ve been contributing to exactly the problem ATTP describes: Guilty Confessions of a YouTube Physicist)

By the way, if you’re not subscribed to ATTP’s blog, I heartily recommend that you sign up right now.

...and Then There's Physics

There’s an interesting paper that someone (I forget who) highlighted on Twitter. It’s about when science becomes too easy. The basic idea is that there are pitfalls to popularising scientific information.

Compared to experts,

laypeople have not undergone any specialized training in a particular domain. As a result, they do not possess the deep-level background knowledge and relevant experience that a competent evaluation of science-related knowledge claims would require.

However, in the process of communicating, and popularising, science, science communicators tend to provide simplified explanations of scientific topics that can

lead[s] readers to underestimate their dependence on experts and conclude that they are capable of evaluating the veracity, relevance, and sufficiency of the contents.

I think that this is an interesting issue and it’s partly what motivated my post about public involvement in science.

However, I am slightly uneasy about this general framing. I think everyone is a…


Beauty and the Biased

A big thank you to Matin Durrani for the invitation to provide my thoughts on the Strumia saga — see “The Worm That (re)Turned” and “The Natural Order of Things?” for previous posts on this topic — for this month’s issue of Physics World. PW kindly allows me to make the pdf of the Opinion piece available here at Symptoms. The original version (with hyperlinks intact) is also below.

(And while I’m at it, an even bigger thank you to Matin, Tushna, and all at PW for this immensely flattering (and entirely undeserved, given the company I’m in) accolade…)


From Physics World, Dec. 2018.

A recent talk at CERN about gender in physics highlights that biases remain widespread. Philip Moriarty says we need to do more to tackle such issues head-on.

When Physics World asked several physicists to name their favourite books for the magazine’s 30th anniversary issue, I knew immediately what I would choose (see October pp 74-78). My “must-read” pick was Sabine Hossenfelder’s exceptionally important Lost In Math: How Beauty Leads Physics Astray, which was released earlier this year.

Hossenfelder, a physicist based at the Frankfurt Institute for Advanced Studies, is an engaging and insightful writer who is funny, self-deprecating, and certainly not afraid to give umbrage. I enjoyed the book immensely, being taken on a journey through modern theoretical physics in which Hossenfelder attempts to make sense of her profession. If there is one chapter of the book that particularly resonated with me, it’s the concluding Chapter 10, “Knowledge is Power”. This is a powerful closing statement that deserves to be widely read by all scientists, but not least by that especially irksome breed of physicist who believes — when all evidence points to the contrary — that they are somehow immune to the social and cognitive biases that affect every other human.

In “Knowledge is Power”, Hossenfelder adeptly outlines the primary biases that all good scientists have striven to avoid ever since the English philosopher Francis Bacon identified his “idols of the tribe” – i.e. the tendency of human nature to prefer certain types of incorrect conclusions. Her pithy single-line summary at the start of the chapter captures the key issue: “In which I conclude the world would be a better place if everyone listened to me”.

Lost in bias

Along with my colleague Omar Almaini from the University of Nottingham, I teach a final-year module entitled “The Politics, Perception, and Philosophy of Physics”. I say teach, but in fact, most of the module consists of seminars that introduce a topic for students to then debate, discuss and argue for the remaining time. We dissect Richard Feynman’s oft-quoted definition of science: “Science is the belief in the ignorance of experts”.  Disagreeing with Feynman is never a comfortable position to adopt, but I think he does science quite a disservice here. The ignorance, and sometimes even the knowledge, of experts underpins the entire scientific effort. After all, collaboration, competition and peer review are the lifeblood of what we do. With each of these come complex social interactions and dynamics and — no matter how hard we try — bias. For this and many other reasons, Lost In Math is now firmly on the module reading list.

At a CERN workshop on high-energy theory and gender at the end of September, theoretical physicist Alessandro Strumia from the University of Pisa claimed that women with fewer citations were being hired over men with greater numbers of citations. The talk provoked an immediate backlash: CERN suspended Strumia pending an investigation, and some 4000 scientists signed a letter calling his talk “disgraceful”. Strumia’s talk was poorly researched, ideologically driven, and an all-round embarrassingly biased tirade against women in physics. I suggest that Strumia needs to take a page — or many — out of Hossenfelder’s book. I was reminded of her final chapter time and time again when I read through Strumia’s cliché-ridden and credulous arguments, his reactionary pearl-clutching palpable from almost every slide of his presentation.

One criticism that has been levelled at Hossenfelder’s analysis is that it does not offer solutions to counter the type of biases that she argues are prevalent in the theoretical-physics community and beyond. Yet Hossenfelder does devote an appendix — admittedly rather short — to listing some pragmatic suggestions for tackling the issues discussed in the book. These include learning about, and thus tackling, social and cognitive biases.

This is all well and good, except that there are none so blind as those who will not see. The type of bias that Strumia’s presentation exemplified is deeply ingrained. In my experience, his views are hardly fringe, either within or outside the physics community — one need only look to the social media furore over James Damore’s similarly pseudoscientific ‘analysis’ of gender differences in the context of his overwrought “Google Manifesto” last year. Just like Damore, Strumia is being held up by the usual suspects as the ever-so-courageous rational scientist speaking “The Truth”, when, of course, he’s entirely wedded to a glaringly obvious ideology and unscientifically cherry-picks his data accordingly. In a masterfully acerbic and exceptionally timely blog post published soon after the Strumia storm broke (“The Strumion. And On”), his fellow particle physicist Jon Butterworth (UCL) highlighted a number of the many fundamental flaws at the core of Strumia’s over-emotional polemic.

Returning to Hossenfelder’s closing chapter, she highlights there that the “mother of all biases” is the “bias blind spot”, or the insistence that we certainly are not biased:

“It’s the reason my colleagues only laugh when I tell them biases are a problem, and why they dismiss my ‘social arguments’, believing they are not relevant to scientific discourse,” she writes. “But the existence of those biases has been confirmed in countless studies. And there is no indication whatsoever that intelligence protects against them; research studies have found no links between cognitive ability and thinking biases.”

Strumia’s diatribe is the perfect example of this bias blind spot in action. His presentation is also a case study in confirmation bias. If only he had taken the time to read and absorb Hossenfelder’s writing, Strumia might well have saved himself the embarrassment of attempting to pass off pseudoscientific guff as credible analysis.

While the beauty of maths leads physics astray, it is ugly bias that will keep us in the dark.