“Some down-to-earth blue sky thinking”

“… a dangerous convergence proceeds apace 

as human beings confer life on machines and

in so doing diminish themselves. 

Your calculus may be greater than his calculus 

but will it pass the Sullenberger Hudson river test?”

from “Insulting Machines”, Mike Cooley

(Published in AI and Society 28 373 (2013))


Last week, I listened to some of the most thought-provoking — and occasionally unsettling — presentations and discussions that I’ve encountered throughout my academic career. On Tuesday, I attended, and participated in, the 2019 Responsible Research and Innovation Conference (organised by Nottingham’s Graduate School and the Institute for Science and Society), while on Wednesday the School of Physics and Astronomy hosted the British Pugwash Ethical Science half-day conference.

More on both of these soon. But before I describe just why I found those conferences as affecting as I did, I wanted to highlight last Monday’s session for the Politics, Perception, and Philosophy of Physics (PPP) module. This was the first of this year’s PPP sessions where the students were given free rein to contribute via debate and discussion, and both Omar Almaini (the co-convenor of PPP) and I were exceptionally impressed by their thoughtful and spirited contributions. (The first three sessions of PPP are in the traditional lecture format. Sessions 4 – 11 are much more akin to the seminar style that is common in arts and humanities disciplines but is very much not the norm in physics courses.)

I have always found the clichés surrounding the STEM vs arts & humanities divide extremely tiresome, and it’s a delight when our students demolish the lazy stereotypes regarding the supposed lack of communication skills of physicists. (Similarly, one of the invited speakers for PPP this year, the sociologist Harry Collins, has shown that social scientists can perform comparably to – or even better than — physicists when it comes to answering physics questions. See “Sociologist Fools Physics Judges” (Nature, 2006) for compelling evidence. More from (and about) Prof. Collins in future posts…)

The title of last Monday’s PPP session was “The Appliance (and non-appliance) of Science” and the slides are embedded below. (Those of you who, like myself, are of a certain vintage might recognise the tag line of the title.)

 

The students drove an hour-long discussion that initially focussed on the two questions given on Slide #3 of the PowerPoint file above but rapidly diverged to cover key related points such as science comms, public engagement, hostility to expertise, and political polarisation. The discussion could have extended well beyond an hour — there were still hands being raised after we’d been in the seminar room for 90 minutes. As is traditional for PPP, I noted down students’ points and questions on the whiteboard as the discussion proceeded. Here are just two of the eight whiteboards’ worth of material…

IMG_8048

IMG_8056

(The remainder of the slides are available at the PPP website.)

In case you can’t read my appalling handwriting, one of the first points raised by the students was the following:

“Curiosity is more than a valid reason to fund research” 

This view kicked off a lot of discussion, culminating in the polar opposite view expressed at the bottom of the whiteboard summary below: “What’s the point of funding anything other than global warming research?”

IMG_8049

“Humanity came and destroyed the world”

The theme of the PPP session last Monday was chosen to align with the Responsible Research and Innovation (RRI2019) and Ethical Science conferences on the following days. This post would be 10,000 words long if I attempted to cover all of the key messages stemming from these conferences so I’ll focus on just a few highlights (out of very many). This story, by Dimitris Papadopoulos’ daughter, was a sobering introduction to the motivations and agenda of RRI2019…

Dimitris was a driving force behind the organisation of RRI2019 (alongside colleagues in the Graduate School) and in his presentation he highlighted key aspects of the RRI framework that would recur time and again throughout the day: generational responsibility; designing for the future; the realisation that what we create often has a lifespan far beyond our own; “the burden is not on the individual researcher” but we are collectively changing the planet.

He also stressed that, in his view, the primary task of science is not just to understand.

In the context of RRI I have a great deal of sympathy with Dimitris’ stance on this latter point. But I also found it rather unsettling because science that is as disinterested as possible and focussed solely on understanding the nature of the world/universe around us has to be a component of the research “landscape”, not least because, time and again throughout history, curiosity-driven science has led to truly disruptive innovations. (Some to the immense benefit of humanity; others less so, admittedly.) Moreover, we need to be exceptionally careful to retain the disinterested character of pure scientific research when it comes to ensuring public trust in just what we do — an issue to which I returned in another RRI2019 session (see below).

Prof. Sarah Sharples, PVC for Diversity, Equality, and Inclusion, was next to speak and made powerful and pointed arguments that senior university (and, indeed, University) management, politicians, and funding bodies of all stripes need to take on board: look beyond simplistic metrics and league tables when it comes to assessing what it means for research to be successful. Sarah highlighted the importance of unintended consequences, particularly when it comes to the ironies of automation; clinical care, in particular, is not just about recording numbers and data.

IMG_8068

Pete Licence, Professor of Chemistry and Director of The GlaxoSmithKline Carbon Neutral Laboratory, continued on the theme of being wary and cognisant of the possibility and potential of unintended consequences, but stressed that sometimes those consequences can be much more positive than we could have ever anticipated. Pete described his collaboration with a number of Ethiopian scientists, which has radically changed both his and their approach to not just the science but the economics associated with green chemistry. He also echoed Sarah Sharples’ key point on the matter of ensuring that we never lose sight of the humanity behind the metrics and tick-boxes: too many lenses mean that, paradoxically, we can often lose focus…

Maybe, Minister?

The RRI conference then split into parallel sessions. This unfortunately meant that I couldn’t go along to the Society and Responsibility discussion — which I was keen to attend (not least because my friend and colleague Brigitte Nerlich was a member of the panel) – as I was participating in the Responsibility in Research and Policy session happening at the same time, alongside Chris Sims (Head of Global Policy Impact at UoN and the Chair and organiser of the session), Steven Hill (Director of Research at Research England, and formerly Head of Policy at HEFCE), and Richard Masterman, UoN’s Associate PVC for Research Strategy and Performance. (All-male panels are never a good look but, in the organisers’ defence, the panel was not initially male only — the original speaker, Dr. Karen Salt (Director of the Centre for Research in Race and Rights at UoN), unfortunately couldn’t make it — and the parallel Society and Responsibility session involved an all-female panel.)

Steven and I have debated and discussed the issues surrounding HEFCE’s, and the research councils’, approach to research impact on a number of occasions — some more heated than others — over the years. (I was very pleased to find that we seem to have converged (give or take) on the middle ground after all these years.) After Chris framed the key themes of the panel discussion, we each had approximately ten minutes to make our case. Steven’s contribution focussed on the core issue of just how research should (or should not) inform policy and just what RRI should look like in that “space”.

The trade-offs and tensions between researchers and politicians were a core theme of Steven’s argument. To a scientist, the answer to any question is invariably “More research is needed”; a politician, on the other hand, ideally has to make a decision, sometimes urgently, on the basis of the evidence at hand. And the last thing they want to be told is that more research is needed. This was also the resounding message I got at Westminster when I participated (along with my Physics & Astronomy colleague Clare Burrage) in the Royal Society’s MP-Scientist scheme back in 2013: science really is not as far up the pecking order as we scientists might like. For this reason, I enthusiastically recommend Chris Tyler‘s illuminating “Top 20 things scientists need to know about policy-making” to the PPP class every year.

Steven mentioned Roger Pielke Jr’s “honest broker” concept — whereby scientists should be entirely disinterested, fully objective reporters of “The Truth” (however that might be defined) when interacting with politicians and policy. In other words, any tendency towards activism — i.e. promoting a particular (geo)political standpoint — should be avoided entirely. I have major qualms with Pielke’s thesis but Ken Rice (aka “…And Then There’s Physics“) has dealt with these much more comprehensively and eloquently than I could ever manage.

I was also put in mind, on more than one occasion during Steven’s presentation, of “The Thick Of It” clip below (which also features in the PPP course each year; apologies for the audio quality).

Richard then outlined the University of Nottingham’s views on the policy-research interface, before I presented the following [1]:

 

The ensuing discussion amongst the panel members, with a lively Q&A from the floor, touched on many of the same points that had been raised during the PPP session the day before: the disinterestedness of research, basic vs applied science, polarisation in politics, trust in scientists (and other professions), the commercialisation of academic research (which was the subject of a particularly pointed question from Jane Calvert in the audience – more on whom below), and balancing public, political, academic, and commercial drivers.

Synthetic Aesthetics and The Wickedness of Global Challenges

In the first session after lunch, the aforementioned Prof. Calvert, of the School of Social and Political Science at Edinburgh, presented an enthralling keynote lecture entitled Responsible Innovation and Experimental Collaboration, in which she described her adventures in synthetic biology, with a particular focus on cross-disciplinary interactions between artists, scientists (of both the social and life variety), and designers.

IMG_8076.JPG

A particularly fascinating aspect of Prof. Calvert’s talk was the description of her work on the Synthetic Aesthetics project, from which a book (among many other “outputs”) has stemmed. I’ll quote directly from the blurb for the book because it captures the core message of Jane’s talk:

In this book, synthetic biologists, artists, designers, and social scientists investigate synthetic biology and design. After chapters that introduce the science and set the terms of the discussion, the book follows six boundary-crossing collaborations between artists and designers and synthetic biologists from around the world, helping us understand what it might mean to ‘design nature.’ These collaborations have resulted in biological computers that calculate form; speculative packaging that builds its own contents; algae that feeds on circuit boards; and a sampling of human cheeses. They raise intriguing questions about the scientific process, the delegation of creativity, our relationship to designed matter, and the importance of critical engagement. Should these projects be considered art, design, synthetic biology, or something else altogether?

I have a long-standing interest in the interface between the arts and the sciences — see, for example, The Silent Poetry of Paint Drying, and these posts — so was fascinated by the interweaving of function, form, and, errmm, fungi in the Synthetic Aesthetics project…

IMG_8095.JPG

The second post-lunch keynote was from Prof. Phil Macnaghten (Wageningen University & Research (WUR), Netherlands), whose work with Matthew Kearnes and James Wilsdon on the ESRC-funded “Governing At The Nanoscale: People, Policies, and Emerging Technologies” project (published in this Demos pamphlet) was more than partly responsible for sparking my nascent interest in the sociology of (nano)science and technology more than a decade ago. Phil’s talk at RRI2019 focussed on how RRI was embedded in practice and policy at the local (WUR), national (EPSRC), and international (Brazil, which is enduring vicious cuts to its science budget) levels.

The Sounds of (Responsible) Salesmen…

I unfortunately only caught the last fifteen minutes or so of the Molecules and Microbes parallel session — chaired by Pete Licence and featuring Prof Steve Howdle (Chemistry, Nottingham), Prof Liz Sockett & Dr Jess Tyson (Life Sciences, Nottingham), and Prof Panos Soultanas (Chemistry, Nottingham) — and so can’t really comment in detail. Panos’ impassioned plea for support for basic, curiosity-driven science certainly resonated, although I can’t say I entirely agreed with his suggestion that irresponsible research wasn’t an issue. (I may have misinterpreted what he meant, however — I didn’t catch all of his presentation.)

The closing plenary was expertly chaired by Dr. Alison Mohr, who introduced, in turn, Dr. Eleanor Kershaw (Synthetic Biology Centre, UoN), Prof. Richard Jones (Physics, University of Sheffield, and erstwhile PVC for Research and Innovation there), and Prof. Martyn Poliakoff. I have known Richard for over fifteen years and have always enjoyed his informed and engaging takes on everything from nanotechnology to transhumanism to the UK’s productivity crisis, via a variety of talks I’ve attended and his blog, Soft Machines. (I also had the pleasure of spending a week at an EPSRC sandpit back in 2007 that was coordinated and steered — in so far as it’s possible to steer a room-full of academics — by Prof. Jones.)

In his plenary, Richard stressed the “scientist as responsible salesman” theme that he has put forward previously (as one of many dimensions of responsibility.) For a characteristically comprehensive analysis of responsible innovation (and irresponsible stagnation), I thoroughly recommend this Soft Machines post.

Martyn Poliakoff brought the conference to a close in his ever-engaging and inimitable style, with a compelling vision of what he and his colleagues have described as a Moore’s law for chemistry,

… namely that over a given period, say five years, sustainable chemists should strive to reduce the amount of a chemical needed to produce a given effect by a factor of two and this process should be repeated for a number of cycles. The key will be to make the whole concept, especially the economics, work for everyone which will require a change in business model for the chemicals market.

[Quote taken from A New Approach to Sustainability: A Moore’s Law for Chemistry, M. Poliakoff, P. Licence, and M. George, Angew. Chem. Int. Ed. 57 12590 (2018)]
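As a back-of-the-envelope illustration of the compounding effect of that proposal (the starting amount and number of cycles below are invented for illustration, not taken from the paper):

```python
def chemical_needed(initial_amount, cycles):
    """Amount of chemical needed to produce a given effect after
    `cycles` successive halvings -- the proposed 'Moore's law for
    chemistry': halve the amount required in each (say) five-year cycle."""
    return initial_amount / 2 ** cycles

# Purely illustrative: start from 100 kg and apply four cycles.
for n in range(5):
    print(f"After {n} cycles ({5 * n} years): {chemical_needed(100, n):.2f} kg")
```

After four cycles the requirement has dropped sixteen-fold, which is precisely why Poliakoff and colleagues stress that the economics, not the arithmetic, is the hard part.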

“Remember your humanity, and forget the rest.”

Although the word Pugwash has an alternative “resonance” for many of us kids of the sixties/seventies, the Pugwash Conferences on Science and World Affairs, and the subsequent International Student/Young Pugwash movement, take their name from the town in Nova Scotia, Canada where Joseph Rotblat and Bertrand Russell established, in 1957, the international organisation to bring together scientists and public figures to address global security, armed conflict, and the threat of weapons of mass destruction (including, in particular, nuclear warfare). The Pugwash conferences were initiated two years after the Russell-Einstein manifesto was issued, which in turn stemmed from Russell’s deep fears about atomic weapons:

The prospect for the human race is sombre beyond all precedent. Mankind are faced with a clear-cut alternative: either we shall all perish, or we shall have to acquire some slight degree of common sense. A great deal of new political thinking will be necessary if utter disaster is to be averted.

Jo(seph) Rotblat was awarded the Nobel Peace Prize in 1995 “for efforts to diminish the part played by nuclear arms in international affairs and, in the longer run, to eliminate such arms.”

I have organised a number of joint events with British Pugwash — more specifically, with Andrew Gibson, the British Pugwash Student Manager — over the last few years, including a PPP seminar given back in Nov. 2016 by Prof. John Finney (UCL), Pugwash Trustee, and a tireless advocate for the organisation. Alongside Peter Jenkins, Chair of British Pugwash, John kicked off the Ethical Science conference at Nottingham last Wednesday with a fascinating account of the history of Pugwash and, in particular, Jo Rotblat’s inspiring life.

Rotblat.png

Dr. Ian Crossland then discussed the ethics and intergenerational issues surrounding nuclear power, followed by a stirring presentation by Sam Harris, climate activist and Nottingham Trent Labour Society’s campaigns officer, on Labour’s Green New Deal.

LauraNolan.png

A particular highlight of not just the Pugwash conference but of all of last week’s events was Laura Nolan‘s remarkable presentation, delivered with tons of energy and passion. (I try to avoid the p-word, given that it’s an obnoxiously lazy cliché, but in this case it is more than justified.) Laura, a Trinity College Dublin computer science graduate, resigned from Google, where she was a software engineer, in 2017 after she was asked to work on a project whose focus was the enhancement of US military drone technology. Laura’s story is recounted in this important Guardian article. (See also this interview.) The quote below, from that article, captures the issues that Laura covered in her talk at the Pugwash conference.

“If you are testing a machine that is making its own decisions about the world around it then it has to be in real time. Besides, how do you train a system that runs solely on software how to detect subtle human behaviour or discern the difference between hunters and insurgents? How does the killing machine out there on its own flying about distinguish between the 18-year-old combatant and the 18-year-old who is hunting for rabbits?”

Anuradha Damale — currently of the Verification Research, Training and Information Centre, and a fellow physicist — had a tough act to follow but she delivered a great talk with quite some aplomb, despite having lost her voice! Anuradha covered the troublesome issue of nuclear weapons verification programmes, and despite the lack of vocal volume, participated in a lively Q&A session with Laura following their talks.

I’m going to close this post with the source of its title: “Down-to-earth blue sky thinking”. The inspiring video embedded below was shown by Tony Simpson — who also discussed Mike Cooley’s pioneering work on the influence of technology on society (and whose prose poem, “Insulting Machines“, is quoted above) — during the closing presentation of the Pugwash conference.

I’ve waffled on for much too long at this point. Let’s hear instead from those whose actions spoke so much louder than words…

 


 

[1] It’s unfortunately not clear from the embedded SlideShare widget of the slides but I cited (and quoted from) this influential blog post when crediting Gemma Derrick and Paul Benneworth with coining the “grimpact” term.

Guilty Confessions of a REFeree

#4 of an occasional series

At the start of this week I spent a day in a room in a university somewhat north of Nottingham with a stack of research papers and a pile of grading sheets. Along with a fellow physicist from a different university (located even further north of Nottingham), I had been asked to act as an external reviewer for the department’s mock REF assessment.

I found it a deeply uncomfortable experience. My discomfort had nothing to do, of course, with our wonderfully genial hosts — thank you all for the hospitality, the conversation, the professionalism, and, of course, lunch. But I’ve vented my spleen previously on the lack of consistency in mock REF ratings (it’s been the most-viewed post at Symptoms… since I resurrected the blog in June last year) and I agreed to participate in the mock assessment so I could see for myself how the process works in practice.

Overall, I’d say that the degree of agreement on “star ratings” before moderation of my co-marker’s grading and mine was at the 70% level, give or take. This is in line with the consistency we observed at Nottingham for independent reviewers in Physics and is therefore, at least, somewhat encouraging. (Other units of assessment for Nottingham’s mock REF review had only 50% agreement.)  But what set my teeth on edge for a not-insignificant number of papers — including quite a few of those on which my gradings agreed with those of my co-marker — was that I simply did not feel at all  qualified to comment.

Even though I’m a condensed matter physicist and we were asked to assess condensed matter physics papers, I simply don’t have the necessary level of hubris to pretend that I can expertly assess any paper in any CMP sub-field. The question that went through my head repeatedly was “If I got this paper from Physical Review Letters (or Phys. Rev. B, or Nature, or Nature Comms, or Advanced Materials, or J. Phys. Chem. C…etc…) would I accept the reviewing invitation or would I decline, telling them it was out of my field of expertise?”  And for the majority of papers the answer to that question was a resounding “I’d decline the invitation.”

So if a paper I was asked to review wasn’t in my (sub-)field of expertise, how did I gauge its reception in the relevant scientific community?

I can’t quite believe I’m admitting this, given my severe misgivings about citation metrics, but, yes, I held my nose and turned to Web of Science. And citation metrics also played a role in the decisions my co-marker made, and in our moderation. This, despite the fact that we had no way of normalising those metrics to the prevailing citation culture of each sub-field, nor of ranking the quality as distinct from the impact of each paper. (One of my absolutely favourite papers of all time – a truly elegant and pioneering piece of work – has picked up a surprisingly low number of citations, as compared to much more pedestrian work in the field.)
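For what it’s worth, the standard bibliometric fix — one we had no data to apply on the day — is to divide a paper’s raw citation count by the mean count for its sub-field (and, ideally, its publication year). The sketch below is hypothetical: the sub-field names and mean values are invented purely to illustrate why raw counts mislead across citation cultures.

```python
# Hypothetical field means -- invented numbers for illustration only.
field_mean_citations = {
    "scanning probe microscopy": 18.0,
    "topological insulators": 55.0,
}

def normalised_impact(citations, subfield):
    """Citations relative to the sub-field average; a value of 1.0
    means 'cited about as often as a typical paper in that area'."""
    return citations / field_mean_citations[subfield]

# A 20-citation paper can out-perform its (quieter) field while a
# 40-citation paper in a 'hotter' area under-performs its own:
print(normalised_impact(20, "scanning probe microscopy"))  # ~1.11
print(normalised_impact(40, "topological insulators"))     # ~0.73
```

Even this normalisation, of course, says nothing about the quality-versus-impact distinction raised above.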

Only when I had to face a stack of papers and grade them for myself did I realise just how exceptionally difficult it is to pass numerical judgment on a piece of work in an area that lies outside my rather small sphere of research. I was, of course, asked to comment on publications in condensed matter physics, ostensibly my area of expertise. But that’s a huge field. Not only is no-one a world-leading expert in all areas of condensed matter physics, it’s almost impossible to keep up with developments in our own narrow sub-fields of interest let alone be au fait with the state of the art in all other sub-fields.

So we therefore turn to citations to try to gauge the extent to which a paper has made ripples — or perhaps even sent shockwaves – through a sub-field in which we have no expertise. My co-marker and I are hardly alone in adopting this citation-counting strategy. But that’s of course no excuse — we were relying on exactly the type of pseudoquantitative heuristic that I have criticised in the past and I felt rather “grubby” at the end of the (rather tiring) day. David Colquhoun made the following point time and again in the run-up to the last REF (and well before):

All this shows what is obvious to everyone but bone-headed bean counters. The only way to assess the merit of a paper is to ask a selection of experts in the field.

Nothing else works.

Nothing.

Bibliometrics are a measure of visibility and “clout” in a particular (yet often nebulously defined) research community; they’re not a quantification of scientific quality. Therefore, very many scientists, and this most definitely includes me, have deep misgivings about using citations to judge a paper’s — let alone a scientist’s — worth.

Although I agree with that quote from David above, the problem is that we need to somehow choose the correct “boundary conditions” for each expert; I can have a reasonable level of expertise in one sub-area of a field — say, scanning probe microscopy or self-assembly or semiconductor surface physics — and a distinct lack of working knowledge, let alone expertise, in another sub-area of that self-same field. I could list literally hundreds of topics where I would, in fact, be winging it.

For many years, and because of my deep aversion to simplistic citation-counting and bibliometrics, I’ve been guilty of the type of not-particularly-joined-up thinking that Dorothy Bishop rightly chastises in this tweet…

We can’t trust the bibliometrics in isolation (for all the reasons (and others) that David Colquhoun lays out here), so when it comes to the REF the argument is that we have to supplement the metrics with “quality control” via another round of ostensibly expert peer review. But the problem is that it’s often not expert peer review; I was certainly not an expert in the subject areas of very many of the papers I was asked to judge. And I’ll hold that no-one can be a world-leading expert in every sub-field of a given area of physics (or any other discipline).

So what are the alternatives?

David has suggested that we should, in essence, retire what’s known as the “dual support” system for research funding (see the video embedded below): “…abolish the REF, and give the money to research councils, with precautions to prevent people being fired because their research wasn’t expensive enough.” I have quite some sympathy with that view because the common argument that the so-called QR funding awarded via the REF is used to support “unpopular” areas of research that wouldn’t necessarily be supported by the research councils is not at all compelling (to put it mildly). Universities demonstrably align their funding priorities and programmes very closely with research council strategic areas; they don’t hand out QR money for research that doesn’t fall within their latest Universal Targetified Globalised Research Themes.

Prof. Bishop has a different suggestion for revamping how QR funding is divvied up, which initially (and naively, for the reasons outlined above) I found a little unsettling. My first-hand experience earlier this week with the publication grading methodology used by the REF — albeit in a mock assessment — has made me significantly more comfortable with Dorothy’s strategy:

“…dispense with the review of quality, and you can obtain similar outcomes by allocating funding at institutional level in relation to research volume.”

Given that grant income is often taken as yet another proxy for research quality, and that there’s a clear Matthew effect (rightly or wrongly) at play in science funding, this correlation between research volume and REF placement is not surprising. As the Times Higher Education article on Dorothy’s proposals went on to quote,

The government should, therefore, consider allocating block funding in proportion to the number of research-active staff at a university because that would shrink the burden on universities and reduce perverse incentives in the system, [Prof Bishop] said.
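The allocation rule being proposed is simple enough to sketch in a couple of lines — the institution names and staff numbers below are, of course, invented for illustration:

```python
def allocate_qr(total_funding, staff_counts):
    """Split a block grant across institutions in proportion to their
    research-active staff numbers (the volume-based proposal above)."""
    total_staff = sum(staff_counts.values())
    return {uni: total_funding * n / total_staff
            for uni, n in staff_counts.items()}

# Invented numbers, purely for illustration:
shares = allocate_qr(1_000_000, {"A": 200, "B": 300, "C": 500})
print(shares)  # {'A': 200000.0, 'B': 300000.0, 'C': 500000.0}
```

The entire apparatus of output grading disappears; the only contested input is who counts as “research-active”.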

Before reacting strongly one way or another, I strongly recommend that you take the time to listen to Prof. Bishop eloquently detail her arguments in the video below.

Here’s the final slide of that presentation:

DorothyBishopRecommendations

So much rests on that final point. Ultimately, the immense time and effort devoted to/wasted on the REF boils down to a lack of trust — by government, funding bodies, and, depressingly, often university senior management — in academics’ ability to motivate themselves without perverse incentives like aiming for a 4* paper. That would be bad enough if we all could agree on what a 4* paper looks like…

Spinning off without IP?

I’ve had the exceptionally good fortune of working with a considerable number of extremely talented, tenacious, and insightful scientists over the years. One of those was Julian Stirling, whose PhD I ostensibly supervised. (In reality, Julian spent quite some time supervising me.) Julian is now a postdoctoral researcher at the University of Bath and is involved in a number of exciting projects there (and elsewhere), including the one he describes in the guest post below. Over to you Julian…


Universities love spin-offs — they show that research has had impact! — but does the taxpayer or the scientific community get good value for money? More importantly, does spinning off help or hurt the research? I fall strongly on the side of arguing that it hurts. Perhaps I am ideologically driven in my support for openness, but when it comes to building scientific instruments I think I have a strong case.

Imagine a scientist has a great idea for a new instrument. It takes three years to build it, and the results are amazing; it revolutionises the field. The scientist will be encouraged by funding bodies to make the research open. Alongside the flashy science papers will probably be a pretty dry paper on the concept of the instrument; these will be openly published. However, there will be no technical drawings, no control software, no warnings to “Never assemble X before Y or all your data will be wrong and you will only find out 3 months later!“. The university and funding agencies will want all of this key information to be held as intellectual property by a spin-off company. This company will then sell instruments to scientists (many funded by the same source that paid for the development).

The real problem comes when two more scientists have great new ideas which require a slightly modified version of the instrument. Unfortunately, as the plans are not available, both their groups must spend 2-3 years reinventing the wheel for their own design just so they can add a new feature. Inevitably both new instruments get spun off. Very soon, the taxpayer has paid for the instrument to be developed three times; a huge amount of time has been put into duplicating effort. And, very probably, the spin-off companies will get into legal battles over intellectual property. This pushes the price of the instruments up as their lawyers get rich. I have ranted about this so many times there is even a cartoon of my rant…

Julian.png

We live in a time when governments are requiring scientific publications to be open access. We live in a world where open source software is so stable and powerful it runs most web-servers, most phones, and all 500 of the world’s fastest supercomputers. Why can’t science hardware be open too? There is a growing movement to do just that, but it is somewhat hampered by people conflating open source hardware and low-cost hardware. If science is going to progress, we should share as much knowledge as possible.

In January 2018 I was very lucky to get a post-doctoral position working on open source hardware at the University of Bath. I became part of the OpenFlexure Microscope project, an open-source laboratory-grade motorised 3D-printed microscope. What most people don’t realise about microscopes is that the majority of the design work goes into working out how to precisely position a sample so you can find and focus on the interesting parts. The OpenFlexure microscope is lower cost than most microscopes due to 3D printing, but this has not been done by just 3D printing the same shapes you would normally machine from metal. That would produce an awful microscope. Instead, the main microscope stage is one single complex piece that only a 3D printer could make. Rather than sliding fine-ground metal components, the flexibility of plastic is used to create a number of flexure hinges. The result is a high-performance microscope which is undergoing trials for malaria diagnosis in Tanzania.

ResearchPartners.jpg

But what about production? A key benefit of the microscope being open is that local companies in regions that desperately need more microscopes can build them for their communities. This creates local industry and lowers initial costs, but, most importantly, it guarantees that local engineers can fix the equipment. Time and time again, well-meaning groups send expensive scientific equipment into low-resource settings with no consideration of how it performs in those conditions, nor any plan for how it can be fixed when problems do arise. For these reasons the research project has a Tanzanian partner, STICLab, who are building (and will soon be selling) microscopes in Tanzania. We hope that other companies in other locations will start to do the same.

The research project had plans to support distributed manufacturing abroad. But what if people in the UK want a microscope? They can always build their own — but this requires time, effort, and a 3D printer. For this reason, Richard Bowman (the creator of OpenFlexure Microscope) and I started our own company, OpenFlexure Industries, to distribute microscopes. Technically, it is not a spin-off as it owns no intellectual property. We hope to show that scientific instruments can be distributed by successful businesses, while the entire project remains open.

People ask me, “How do you stop another company undercutting you and selling them for less?” The answer is: we don’t. We want people to have microscopes; if someone undercuts us, we have achieved that goal. The taxpayer rented Richard’s brain when they gave him the funding to develop the microscope, and now everyone owns the design.

The company is only a month old, but we are happy to have been nominated for a Great West Business Award. If you support the cause of open source hardware and distributed manufacturing we would love your vote.

Bullshit and Beyond: From Chopra to Peterson

Harry G Frankfurt‘s On Bullshit is a modern classic. He highlights the style-over-substance tenor of the most fragrant and flagrant bullshit, arguing that

It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction. A person who lies is thereby responding to the truth, and he is to that extent respectful of it. When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he considers his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.

In other words, the bullshitter doesn’t care about the validity or rigour of their arguments. They are much more concerned with being persuasive. One aspect of BS that doesn’t quite get the attention it deserves in Frankfurt’s essay, however, is that special blend of obscurantism and vacuity that is the hallmark of three world-leading bullshitters of our time:  Deepak Chopra, Karen Barad (see my colleague Brigitte Nerlich’s important discussion of Barad’s wilfully impenetrable language here), and Jordan Peterson. In a talk for the University of Nottingham Agnostic, Secularist, and Humanist Society last night (see here for the blurb/advert), I focussed on the intriguing parallels between their writing and oratory. Here’s the video of the talk.

Thanks to UNASH for the invitation. I’ve not included the lengthy Q&A that followed (because I stupidly didn’t ask for permission to film audience members’ questions). I’m hoping that some discussion and debate might ensue in the comments section below. If you do dive in, try not to bullshit too much…

LIYSF 2018: Science Without Borders*

Better the pride that resides
In a citizen of the world
Than the pride that divides
When a colourful rag is unfurled

From Territories. Track 5 of Rush’s Power Windows (1985). Lyrics: Neil Peart.


LIYSF.JPG

Last night I had the immense pleasure and privilege of giving a plenary lecture for the London International Youth Science Forum. 2018 marks the 60th annual forum, a two-week event that brings together 500 students (aged 16 – 21) from, this year, seventy different countries…

LIYSF_countries.jpg

The history of the forum is fascinating. Embarrassingly, until I received the invitation to speak I was unaware of the LIYSF’s impressive and exciting efforts over many decades to foster and promote, in parallel, science education and international connections. The “science is global” message is at the core of the Forum’s ethos, as described at the LIYSF website:

The London International Youth Science Forum was the brainchild of the late Philip S Green. In the aftermath of the Second World War an organisation was founded in Europe by representatives from Denmark, Czech Republic, the Netherlands and the United Kingdom in an effort to overcome the animosity resulting from the war. Plans were made to set up group home-to-home exchanges between schools and communities in European countries. This functioned with considerable success and in 1959 Philip Green decided to provide a coordinated programme for groups from half a dozen European countries and, following the belief that ‘out of like interests the strongest friendships grow.’ He based the programme on science.

The printed programme for LIYSF 2018 includes a message from the Prime Minister…

MayLIYSF.JPG

It’s a great shame that the PM’s message above makes no mention of LIYSF’s work, since its inception in 1959, in breaking down borders and barriers between scientists in different countries. But given that her government and her political party have been responsible for driving the appalling isolationism and, in its worst excesses, xenophobia of Brexit, it’s not at all surprising that she might want to gloss over that aspect of the Forum…

The other slightly irksome aspect of May’s message, and something I attempted to counter during the lecture last night, is the focus on “demand for STEM skills”, as if non-STEM subjects were somehow of intrinsically less value. Yes, I appreciate that it’s a science forum, and, yes, I appreciate that the LIYSF students are largely focussed on careers in science and engineering. But we need to encourage a greater appreciation of the value of non-STEM subjects. I, for one, was torn between opting to do an English or a physics degree at university. As I’ve banged on about previously, the A-level system frustratingly tends to exacerbate this artificial “two cultures” divide between STEM subjects and the arts and humanities. We need science and maths. And we need economics, philosophy, sociology, English lit, history, geography, modern (and not-so-modern) languages…

The arrogance of a certain breed of STEM student (or researcher or lecturer) who thinks that the ability to do complicated maths is the pinnacle of intellectual achievement also helps to drive this wedge between the disciplines. And yet those particular students, accomplished though they may well be in vector calculus, contour integration, and/or solving partial differential equations, often flounder completely when asked to write five-hundred words that are reasonably engaging and/or entertaining.

Borders and boundaries, be they national or disciplinary, encourage small-minded, insular thinking. Encouragingly, there was none of that on display last night. After the hour-long lecture, I was blown away, time and again, by the intelligent, perceptive, and, at times, provocative (in a very good way!) questions from the LIYSF students. After an hour and a half of questions, security had to kick us out of the theatre because it was time to lock up.

Clare Elwell, who visited Nottingham last year to give a fascinating and inspirational Masterclass lecture on her ground-breaking research for our Physics & Astronomy students, is the President of the LIYSF. It’s no exaggeration to say that the impact of the LIYSF on Clare’s future, when she attended as a student, was immense. I’ll let Clare explain:

 I know how impactful and inspiring these experiences can be, as I attended the Forum myself as a student over thirty years ago. It was here that I was first introduced to Medical Physics – an area of science which I have pursued as a career ever since. Importantly, the Forum also opened my eyes to the power of collaboration and communication across scientific disciplines and national borders to address global challenges — something which has formed a key element of my journey in science, and which the world needs now more than ever.

(That quote is also taken from the LIYSF 2018 Programme.)

My lecture was entitled “Bit from It: Manipulating matter bond by bond”. A number of students asked whether I’d make the slides available, which, of course, is my pleasure (via that preceding link). In addition, some students asked about the physics underpinning the “atomic force macroscope [1]” (and the parallels with its atomic force microscope counterpart) that I used as a demonstration in the talk:

IMG_4682.JPG

(Yes, the coffee is indeed an integral component of the experimental set-up [2]).

Unfortunately, due to the size of the theatre only a small number of the students could really see the ‘guts’ of the “macroscope”. I’m therefore going to write a dedicated post in the not-too-distant future on just how it works, its connections to atomic force microscopy, and its much more advanced sibling the LEGOscope (the result of a third year undergraduate project carried out by two very talented students).

The LIYSF is a huge undertaking and it’s driven by the hard work and dedication of a wonderful team of people. I’ve got to say a big thank you to those of that team I met last night and who made my time at LIYSF so very memorable: Director Richard Myhill for the invitation (and Clare (Elwell) for the recommendation) and for sorting out all of the logistics of my visit; Sam Thomas and Simran Mohnani, Programme Liaison; Rhia Patel and Vilius Uksas, Engagement Manager and Videographer, respectively. (It’s Vilius you can see with the camera pointed in my direction in the photo at the top there.); Victoria Sciandro (Deputy Host. Victoria also had the task of summarising my characteristically rambling lecture before the Q&A session started and did an exceptional job, given the incoherence of the source material); and James, whose surname I’ve embarrassingly forgotten but who was responsible for all of the audio-video requirements, the sound and the lighting. He did an exceptional job. Thank you, James. (I really hope I’ve not forgotten anyone. If I have, my sincere apologies.)

Although this was my first time at the LIYSF, I sincerely hope it won’t be my last. It was a genuinely inspiring experience to spend time with such enthusiastic and engaging students. The future of science is in safe hands.

We opened the post with Rush. So let’s bring things full circle and close with that Toronto trio… [3]


* “Science Without Borders” is also the name of the agency that funds the PhD research of Filipe Junquiera in the Nottingham Nanoscience Group. As this blog post on Filipe’s journey to Nottingham describes, he’s certainly crossed borders.

[1] Thanks to my colleague Chris Mellor for coining the “atomic force macroscope” term.

[2] It’s not. (The tiresome literal-mindedness of some online never ceases to amaze me. Better safe than sorry.)

[3] Great to be asked a question from the floor by a fellow Rush fan last night. And he was Canadian to boot!

In Praise of ‘Small Astronomy’

My colleague and friend, Mike Merrifield, wrote the following thought-provoking post, recently featured at the University of Nottingham blog. I’m reposting it here at “Symptoms…” because although I’m not an astronomer, Mike’s points regarding big vs small science are also pertinent to my field of research: condensed matter physics/nanoscience. Small research teams have made huge contributions in these areas over the years; many of the pioneering, ground-breaking advances in single atom/molecule imaging and manipulation have come from teams of no more than three or four researchers. Yet there’s a frustrating and troublesome mindset — especially among those who hold the purse strings at universities and funding bodies — that “small science” is outmoded and so last century. Much better to spend funding on huge multi-investigator teams with associated shiny new research institutes, apparently.

That’s enough from me. Over to Mike…


A number of years back, I had the great privilege of interviewing the Dutch astronomer Adriaan Blaauw for a TV programme.  He must have been well into his eighties at the time, but was still cycling into work every day at the University of Leiden, and had fascinating stories to tell about the very literal perils of trying to undertake astronomical research under Nazi occupation; the early days of the European Southern Observatory (ESO) of which he was one of the founding figures; and his involvement with the Hipparcos satellite, which had just finished gathering data on the exact positions of a million stars to map out the structure of the Milky Way.

When the camera stopped rolling and we were exchanging wind-down pleasantries, I was taken aback when Professor Blaauw suddenly launched into a passionate critique of big science projects like the very one we had been discussing.  He was very concerned that astronomy had lost its way, and rather than thinking in any depth about what new experiments we should be doing, we kept simply pursuing more and more data.  His view was that all we would do with data sets like that produced by Hipparcos would be to skim off the cream and then turn our attention to the next bigger and better mission rather than investing the time and effort needed to exploit these data properly.  With technology advancing at such a rapid pace, this pressure will always be there – why work hard for many months to optimise the exploitation of this year’s high-performance computers, when next year’s will be able to do the same task as a trivial computation?  Indeed, the Hipparcos catalogue of a million stars is even now in the process of being superseded by the Gaia mission making even higher quality measurements of a billion stars.

Of course there are two sides to this argument.  Some science simply requires the biggest and the best.  Particle physicists, for example, need ever-larger machines to explore higher energy regimes to probe new areas of fundamental physics.  And some results can only be obtained through the collection of huge amounts of data to find the rare phenomena that are buried in such an avalanche, and to build up statistics to a point where conclusions become definitive.  This approach has worked very well in astronomy, where collaborations such as the Sloan Digital Sky Survey (SDSS) have brought together thousands of researchers to work on projects on a scale that none could undertake individually.  Such projects have also democratized research in that although the data from surveys such as SDSS are initially reserved for the participants who have helped pay for the projects, the proprietary period is usually quite short, so the data are available to anyone in the world with internet access to explore and publish their own findings.

Unfortunately, there is a huge price to pay for these data riches. First, there is definitely some truth in Blaauw’s critique, with astronomers behaving increasingly like magpies, drawn to the shiniest bauble in the newest, biggest data set.  This tendency is amplified by the funding of research, where the short proprietary period on such data means that those who are “on the team” have a cast-iron case as to why their grant should be funded this round, because by next round anyone in the world could have done the analysis.  And of course by the time the next funding round comes along there is a new array of time-limited projects that will continue to squeeze out any smaller programmes or exploitation of older data.

But there are other problems that are potentially even more damaging to this whole scientific enterprise.  There is a real danger that we simply stop thinking.  If you ask astronomers what they would do with a large allocation of telescope time, most would probably say they would do a survey larger than any other.  It is, after all, a safe option: all those results that were right at the edge of statistical significance will be confirmed (or refuted) by ten times as much data, so we know we will get interesting results.  But is it really the best use of the telescope?  Could we learn more by targeting observations to many much more specific questions, each of which requires a relatively modest investment of time?  This concern also touches on the wider philosophical question of the “right” way to do science.  With a big survey, the temptation is always to correlate umpteen properties of the data with umpteen others until something interesting pops out, then try to explain it.  This a posteriori approach is fraught with difficulty, as making enough plots will always turn up a correlation, and it is then always possible to reverse engineer an explanation for what you have found.  Science progresses in a much more robust (and satisfying) way when the idea comes first, followed by thinking of an experiment that is explicitly targeted to test the hypothesis, and then the thrill of discovering that the Universe behaves as you had predicted (or not!) when you analyse the results of the test.
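Mike's warning about a posteriori plot-dredging is easy to demonstrate numerically: take a handful of completely independent random "properties" and correlate every pair, and the strongest correlation found will look impressive purely by chance. A quick illustrative sketch (entirely hypothetical data, nothing to do with any real survey):

```python
import numpy as np

rng = np.random.default_rng(42)

# 20 completely independent "properties" measured for 50 objects.
# By construction, any correlation between them is pure noise.
n_objects, n_properties = 50, 20
data = rng.normal(size=(n_objects, n_properties))

# Correlate every property against every other: 20*19/2 = 190 pairs.
corr = np.corrcoef(data, rowvar=False)
strongest = np.abs(corr[np.triu_indices(n_properties, k=1)])

print(f"{strongest.size} pairwise comparisons")
print(f"strongest |r| found by dredging: {strongest.max():.2f}")
```

For a single pre-registered pair of 50-point samples, a correlation coefficient this large would be striking; across 190 comparisons, something of that size turns up essentially every time. That is exactly why stating the hypothesis before looking at the data matters.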

Finally, and perhaps most damagingly, we are turning out an entire generation of new astronomers who have only ever worked on mining such big data sets.  As PhD students, they will have been small cogs in the massive machines that drive these big surveys forward, so the chances of them having their names associated with any exciting results are rather small – not unreasonably, those who may have invested most of a career in getting the survey off the ground will feel they have first call on any such headlines.  The students will also have never seen a project all the way through from first idea on the back of a beer mat through telescope proposals, observations, analysis, write-up and publication.  Without that overview of the scientific process on the modest scale of a PhD project, they will surely be ill prepared for taking on leadership roles on bigger projects further down the line.

I suppose it all comes down to a question of balance: there are some scientific results that would simply be forever inaccessible without large-scale surveys, but we have to somehow protect the smaller-scale operations that can produce some of the most innovative results, while also helping to keep the whole endeavour on track.  At the moment, we seem to be very far from that balance point, and are instead playing out Adriaan Blaauw’s nightmare.

Politics. Perception. Philosophy. And Physics.

Today is the start of the new academic year at the University of Nottingham (UoN) and, as ever, it crept up on me and then leapt out with a fulsome “Gotcha”. Summer flies by so very quickly. I’ll be meeting my new 1st year tutees this afternoon to sort out when we’re going to have tutorials and, of course, to get to know them. One of the great things about the academic life is watching tutees progress over the course of their degree from that first “getting to know each other” meeting to when they graduate.

The UoN has introduced a considerable number of changes to the “student experience” of late via its Project Transform process. I’ve vented my spleen about this previously but it’s a subject to which I’ll be returning in the coming weeks because Transform says an awful lot about the state of modern universities.

For now, I’m preparing for a module entitled “The Politics, Perception and Philosophy of Physics” (F34PPP) that I run in the autumn semester. This is a somewhat untraditional physics module because, for one thing, it’s almost entirely devoid of mathematics. I thoroughly enjoy  F34PPP each year (despite this amathematical heresy) because of the engagement and enthusiasm of the students. The module is very much based on their contributions — I am more of a mediator than a lecturer.

STEM students are sometimes criticised (usually by Simon Jenkins) for having poorly developed communication skills. This is an especially irritating stereotype in the context of the PPP module, where I have been deeply impressed by the quality of the writing the students submit. As I discuss in the video below (an  overview of the module), I’m not alone in recognising this: articles submitted as F34PPP coursework have been published in Physics World, the flagship magazine of the Institute of Physics.

 

In the video I note that my intention is to upload a weekly video for each session of the module. I’m going to do my utmost to keep this promise and, moreover, to accompany each of those videos with a short(ish) blog post. (But, to cover my back, I’ll just note in advance that the best laid schemes gang aft agley…)