Spinning off without IP?

I’ve had the exceptionally good fortune of working with a considerable number of extremely talented, tenacious, and insightful scientists over the years. One of those was Julian Stirling, whose PhD I ostensibly supervised. (In reality, Julian spent quite some time supervising me.) Julian is now a postdoctoral researcher at the University of Bath and is involved in a number of exciting projects there (and elsewhere), including the one he describes in the guest post below. Over to you Julian…


Universities love spin-offs — they show that research has had impact! — but does the taxpayer or the scientific community get good value for money? More importantly, does spinning off help or hurt the research? I fall strongly on the side of arguing that it hurts. Perhaps I am ideologically driven in my support for openness, but when it comes to building scientific instruments I think I have a strong case.

Imagine a scientist has a great idea for a new instrument. It takes three years to build it, and the results are amazing; it revolutionises the field. The scientist will be encouraged by funding bodies to make the research open. Alongside the flashy science papers will probably be a pretty dry paper on the concept of the instrument; these will be openly published. However, there will be no technical drawings, no control software, no warnings to “Never assemble X before Y or all your data will be wrong and you will only find out 3 months later!”. The university and funding agencies will want all of this key information to be held as intellectual property by a spin-off company. This company will then sell instruments to scientists (many funded by the same source that paid for the development).

The real problem comes when two more scientists both have great new ideas which require a slightly modified version of the instrument. Unfortunately, as the plans are not available, both their groups must spend 2-3 years reinventing the wheel for their own design just so they can add a new feature. Inevitably both new instruments get spun off. Very soon, the taxpayer has paid for the instrument to be developed three times; a huge amount of time has been put into duplicating effort. And, very probably, the spin-off companies will get into legal battles over intellectual property. This pushes the price of the instruments up as their lawyers get rich. I have ranted about this so many times there is even a cartoon of my rant…

Julian.png

We live in a time when governments are requiring scientific publications to be open access. We live in a world where open source software is so stable and powerful it runs most web servers, most phones, and all 500 of the world’s fastest supercomputers. Why can’t science hardware be open too? There is a growing movement to do just that, but it is somewhat hampered by people conflating open source hardware and low-cost hardware. If science is going to progress, we should share as much knowledge as possible.

In January 2018 I was very lucky to get a post-doctoral position working on open source hardware at the University of Bath. I became part of the OpenFlexure Microscope project, an open-source laboratory-grade motorised 3D-printed microscope. What most people don’t realise about microscopes is that the majority of the design work goes into working out how to precisely position a sample so you can find and focus on the interesting parts. The OpenFlexure microscope is lower cost than most microscopes thanks to 3D printing, but not because it simply 3D prints the same shapes you would normally machine from metal. That would produce an awful microscope. Instead, the main microscope stage is one single complex piece that only a 3D printer could make. Rather than sliding fine-ground metal components, the flexibility of plastic is used to create a number of flexure hinges. The result is a high-performance microscope which is undergoing trials for malaria diagnosis in Tanzania.

ResearchPartners.jpg

But what about production? A key benefit of the microscope being open is that local companies in regions that desperately need more microscopes can build them for their communities. This creates local industry and lowers initial costs, but, most importantly, it guarantees that local engineers can fix the equipment. Time and time again, well-meaning groups send expensive scientific equipment into low-resource settings with no consideration of how it performs in those conditions and no plan for how it can be fixed when problems do arise. For these reasons the research project has a Tanzanian partner, STICLab, who are building (and will soon be selling) microscopes in Tanzania. We hope that other companies in other locations will start to do the same.

The research project had plans to support distributed manufacturing abroad. But what if people in the UK want a microscope? They can always build their own — but this requires time, effort, and a 3D printer. For this reason, Richard Bowman (the creator of OpenFlexure Microscope) and I started our own company, OpenFlexure Industries, to distribute microscopes. Technically, it is not a spin-off as it owns no intellectual property. We hope to show that scientific instruments can be distributed by successful businesses, while the entire project remains open.

People ask me “How do you stop another company undercutting you and selling them for less?” The answer is: we don’t. We want people to have microscopes; if someone undercuts us, we have achieved that goal. The taxpayer rented Richard’s brain when they gave him the funding to develop the microscope, and now everyone owns the design.

The company is only a month old, but we are happy to have been nominated for a Great West Business Award. If you support the cause of open source hardware and distributed manufacturing we would love your vote.

Bullshit and Beyond: From Chopra to Peterson

Harry G Frankfurt‘s On Bullshit is a modern classic. He highlights the style-over-substance tenor of the most fragrant and flagrant bullshit, arguing that

It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction. A person who lies is thereby responding to the truth, and he is to that extent respectful of it. When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he considers his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.

In other words, the bullshitter doesn’t care about the validity or rigour of their arguments. They are much more concerned with being persuasive. One aspect of BS that doesn’t quite get the attention it deserves in Frankfurt’s essay, however, is that special blend of obscurantism and vacuity that is the hallmark of three world-leading bullshitters of our time: Deepak Chopra, Karen Barad (see my colleague Brigitte Nerlich’s important discussion of Barad’s wilfully impenetrable language here), and Jordan Peterson. In a talk for the University of Nottingham Agnostic, Secularist, and Humanist Society last night (see here for the blurb/advert), I focussed on the intriguing parallels between their writing and oratory. Here’s the video of the talk.

Thanks to UNASH for the invitation. I’ve not included the lengthy Q&A that followed (because I stupidly didn’t ask for permission to film audience members’ questions). I’m hoping that some discussion and debate might ensue in the comments section below. If you do dive in, try not to bullshit too much…


LIYSF 2018: Science Without Borders*

Better the pride that resides
In a citizen of the world
Than the pride that divides
When a colourful rag is unfurled

From Territories. Track 5 of Rush’s Power Windows (1985). Lyrics: Neil Peart.


LIYSF.JPG

Last night I had the immense pleasure and privilege of giving a plenary lecture for the London International Youth Science Forum. 2018 marks the 60th annual forum, a two-week event that brings together 500 students (aged 16 – 21) from, this year, seventy different countries…

LIYSF_countries.jpg

The history of the forum is fascinating. Embarrassingly, until I received the invitation to speak I was unaware of the LIYSF’s impressive and exciting efforts over many decades to foster and promote, in parallel, science education and international connections. The “science is global” message is at the core of the Forum’s ethos, as described at the LIYSF website:

The London International Youth Science Forum was the brainchild of the late Philip S Green. In the aftermath of the Second World War an organisation was founded in Europe by representatives from Denmark, Czech Republic, the Netherlands and the United Kingdom in an effort to overcome the animosity resulting from the war. Plans were made to set up group home-to-home exchanges between schools and communities in European countries. This functioned with considerable success and in 1959 Philip Green decided to provide a coordinated programme for groups from half a dozen European countries and, following the belief that ‘out of like interests the strongest friendships grow.’ He based the programme on science.

The printed programme for LIYSF 2018 includes a message from the Prime Minister…

MayLIYSF.JPG

It’s a great shame that the PM’s message above makes no mention of LIYSF’s work in breaking down borders and barriers between scientists in different countries since its inception in 1959. But given that her government and her political party have been responsible for driving the appalling isolationism and, in its worst excesses, xenophobia of Brexit, it’s not at all surprising that she might want to gloss over that aspect of the Forum…

The other slightly irksome aspect of May’s message, and something I attempted to counter during the lecture last night, is the focus on “demand for STEM skills”, as if non-STEM subjects were somehow of intrinsically less value. Yes, I appreciate that it’s a science forum, and, yes, I appreciate that the LIYSF students are largely focussed on careers in science and engineering. But we need to encourage a greater appreciation of the value of non-STEM subjects. I, for one, was torn between opting to do an English or a physics degree at university. As I’ve banged on about previously, the A-level system frustratingly tends to exacerbate this artificial “two cultures” divide between STEM subjects and the arts and humanities. We need science and maths. And we need economics, philosophy, sociology, English lit, history, geography, modern (and not-so-modern) languages…

The arrogance of a certain breed of STEM student (or researcher or lecturer) who thinks that the ability to do complicated maths is the pinnacle of intellectual achievement also helps to drive this wedge between the disciplines. And yet those particular students, accomplished though they may well be in vector calculus, contour integration, and/or solving partial differential equations, often flounder completely when asked to write five-hundred words that are reasonably engaging and/or entertaining.

Borders and boundaries, be they national or disciplinary, encourage small-minded, insular thinking. Encouragingly, there was none of that on display last night. After the hour-long lecture, I was blown away, time and again, by the intelligent, perceptive, and, at times, provocative (in a very good way!) questions from the LIYSF students. After an hour and a half of questions, security had to kick us out of the theatre because it was time to lock up.

Clare Elwell, who visited Nottingham last year to give a fascinating and inspirational Masterclass lecture on her ground-breaking research for our Physics & Astronomy students, is the President of the LIYSF. It’s no exaggeration to say that the impact of the LIYSF on Clare’s future, when she attended as a student, was immense. I’ll let Clare explain:

I know how impactful and inspiring these experiences can be, as I attended the Forum myself as a student over thirty years ago. It was here that I was first introduced to Medical Physics – an area of science which I have pursued as a career ever since. Importantly, the Forum also opened my eyes to the power of collaboration and communication across scientific disciplines and national borders to address global challenges — something which has formed a key element of my journey in science, and which the world needs now more than ever.

(That quote is also taken from the LIYSF 2018 Programme.)

My lecture was entitled “Bit from It: Manipulating matter bond by bond”. A number of students asked whether I’d make the slides available, which, of course, is my pleasure (via that preceding link). In addition, some students asked about the physics underpinning the “atomic force macroscope [1]” (and the parallels with its atomic force microscope counterpart) that I used as a demonstration in the talk:

IMG_4682.JPG

(Yes, the coffee is indeed an integral component of the experimental set-up [2]).

Unfortunately, due to the size of the theatre only a small number of the students could really see the ‘guts’ of the “macroscope”. I’m therefore going to write a dedicated post in the not-too-distant future on just how it works, its connections to atomic force microscopy, and its much more advanced sibling the LEGOscope (the result of a third year undergraduate project carried out by two very talented students).

The LIYSF is a huge undertaking and it’s driven by the hard work and dedication of a wonderful team of people. I’ve got to say a big thank you to those of that team I met last night and who made my time at LIYSF so very memorable: Director Richard Myhill for the invitation (and Clare (Elwell) for the recommendation) and for sorting out all of the logistics of my visit; Sam Thomas and Simran Mohnani, Programme Liaison; Rhia Patel and Vilius Uksas, Engagement Manager and Videographer, respectively. (It’s Vilius you can see with the camera pointed in my direction in the photo at the top there.); Victoria Sciandro (Deputy Host. Victoria also had the task of summarising my characteristically rambling lecture before the Q&A session started and did an exceptional job, given the incoherence of the source material); and James, whose surname I’ve embarrassingly forgotten but who was responsible for all of the audio-video requirements, the sound and the lighting. He did an exceptional job. Thank you, James. (I really hope I’ve not forgotten anyone. If I have, my sincere apologies.)

Although this was my first time at the LIYSF, I sincerely hope it won’t be my last. It was a genuinely inspiring experience to spend time with such enthusiastic and engaging students. The future of science is in safe hands.

We opened the post with Rush. So let’s bring things full circle and close with that Toronto trio… [3]


* “Science Without Borders” is also the name of the agency that funds the PhD research of Filipe Junquiera in the Nottingham Nanoscience Group. As this blog post on Filipe’s journey to Nottingham describes, he’s certainly crossed borders.

[1] Thanks to my colleague Chris Mellor for coining the “atomic force macroscope” term.

[2] It’s not. (The tiresome literal-mindedness of some online never ceases to amaze me. Better safe than sorry.)

[3] Great to be asked a question from the floor by a fellow Rush fan last night. And he was Canadian to boot!

In Praise of ‘Small Astronomy’

My colleague and friend, Mike Merrifield, wrote the following thought-provoking post, recently featured at the University of Nottingham blog. I’m reposting it here at “Symptoms…” because although I’m not an astronomer, Mike’s points regarding big vs small science are also pertinent to my field of research: condensed matter physics/nanoscience. Small research teams have made huge contributions in these areas over the years; many of the pioneering, ground-breaking advances in single atom/molecule imaging and manipulation have come from teams of no more than three or four researchers. Yet there’s a frustrating and troublesome mindset — especially among those who hold the purse strings at universities and funding bodies — that “small science” is outmoded and so last century. Much better to spend funding on huge multi-investigator teams with associated shiny new research institutes, apparently.

That’s enough from me. Over to Mike…


A number of years back, I had the great privilege of interviewing the Dutch astronomer Adriaan Blaauw for a TV programme.  He must have been well into his eighties at the time, but was still cycling into work every day at the University of Leiden, and had fascinating stories to tell about the very literal perils of trying to undertake astronomical research under Nazi occupation; the early days of the European Southern Observatory (ESO) of which he was one of the founding figures; and his involvement with the Hipparcos satellite, which had just finished gathering data on the exact positions of a million stars to map out the structure of the Milky Way.

When the camera stopped rolling and we were exchanging wind-down pleasantries, I was taken aback when Professor Blaauw suddenly launched into a passionate critique of big science projects like the very one we had been discussing.  He was very concerned that astronomy had lost its way, and rather than thinking in any depth about what new experiments we should be doing, we kept simply pursuing more and more data.  His view was that all we would do with data sets like that produced by Hipparcos would be to skim off the cream and then turn our attention to the next bigger and better mission rather than investing the time and effort needed to exploit these data properly.  With technology advancing at such a rapid pace, this pressure will always be there – why work hard for many months to optimise the exploitation of this year’s high-performance computers, when next year’s will be able to do the same task as a trivial computation?  Indeed, the Hipparcos catalogue of a million stars is even now in the process of being superseded by the Gaia mission making even higher quality measurements of a billion stars.

Of course there are two sides to this argument.  Some science simply requires the biggest and the best.  Particle physicists, for example, need ever-larger machines to explore higher energy regimes to probe new areas of fundamental physics.  And some results can only be obtained through the collection of huge amounts of data to find the rare phenomena that are buried in such an avalanche, and to build up statistics to a point where conclusions become definitive.  This approach has worked very well in astronomy, where collaborations such as the Sloan Digital Sky Survey (SDSS) have brought together thousands of researchers to work on projects on a scale that none could undertake individually.  Such projects have also democratized research in that although the data from surveys such as SDSS are initially reserved for the participants who have helped pay for the projects, the proprietary period is usually quite short so the data are available to anyone in the world with internet access to explore and publish their own findings.

Unfortunately, there is a huge price to pay for these data riches. First, there is definitely some truth in Blaauw’s critique, with astronomers behaving increasingly like magpies, drawn to the shiniest bauble in the newest, biggest data set.  This tendency is amplified by the funding of research, where the short proprietary period on such data means that those who are “on the team” have a cast-iron case as to why their grant should be funded this round, because by next round anyone in the world could have done the analysis.  And of course by the time the next funding round comes along there is a new array of time-limited projects that will continue to squeeze out any smaller programmes or exploitation of older data.

But there are other problems that are potentially even more damaging to this whole scientific enterprise.  There is a real danger that we simply stop thinking.  If you ask astronomers what they would do with a large allocation of telescope time, most would probably say they would do a survey larger than any other.  It is, after all, a safe option: all those results that were right at the edge of statistical significance will be confirmed (or refuted) by ten times as much data, so we know we will get interesting results.  But is it really the best use of the telescope?  Could we learn more by targeting observations to many much more specific questions, each of which requires a relatively modest investment of time?  This concern also touches on the wider philosophical question of the “right” way to do science.  With a big survey, the temptation is always to correlate umpteen properties of the data with umpteen others until something interesting pops out, then try to explain it.  This a posteriori approach is fraught with difficulty, as making enough plots will always turn up a correlation, and it is then always possible to reverse engineer an explanation for what you have found.  Science progresses in a much more robust (and satisfying) way when the idea comes first, followed by thinking of an experiment that is explicitly targeted to test the hypothesis, and then the thrill of discovering that the Universe behaves as you had predicted (or not!) when you analyse the results of the test.
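Mike’s point about a posteriori correlation-hunting is easy to demonstrate numerically. The short Python sketch below is purely illustrative (the “survey”, its size, and its property count are all invented): it generates completely random data and then correlates every property against every other. With 190 pairs to pick from, the strongest correlation will almost always pass a naive p < 0.05 significance test, even though there is, by construction, nothing real to find.

```python
import numpy as np

rng = np.random.default_rng(0)

# A fake "survey": 1,000 objects, each with 20 completely independent
# random properties -- i.e. there is no genuine correlation to discover.
n_objects, n_properties = 1000, 20
data = rng.normal(size=(n_objects, n_properties))

# Correlate every property against every other: 20 * 19 / 2 = 190 pairs.
best_r, best_pair = 0.0, None
for i in range(n_properties):
    for j in range(i + 1, n_properties):
        r = np.corrcoef(data[:, i], data[:, j])[0, 1]
        if abs(r) > abs(best_r):
            best_r, best_pair = r, (i, j)

# For n = 1000, any |r| greater than roughly 0.062 passes a naive
# two-sided p < 0.05 test -- and with 190 tries, one almost always will.
print(f"strongest correlation: r = {best_r:+.3f} for property pair {best_pair}")
print(f"passes naive p < 0.05 threshold? {abs(best_r) > 0.062}")
```

Correcting properly for the 190 comparisons (a Bonferroni-style adjusted threshold, say) makes the “discovery” evaporate — which is precisely the hypothesis-first discipline Mike is arguing for.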

Finally, and perhaps most damagingly, we are turning out an entire generation of new astronomers who have only ever worked on mining such big data sets.  As PhD students, they will have been small cogs in the massive machines that drive these big surveys forward, so the chances of them having their names associated with any exciting results are rather small – not unreasonably, those who may have invested most of a career in getting the survey off the ground will feel they have first call on any such headlines.  The students will also have never seen a project all the way through from first idea on the back of a beer mat through telescope proposals, observations, analysis, write-up and publication.  Without that overview of the scientific process on the modest scale of a PhD project, they will surely be ill prepared for taking on leadership roles on bigger projects further down the line.

I suppose it all comes down to a question of balance: there are some scientific results that would simply be forever inaccessible without large-scale surveys, but we have to somehow protect the smaller-scale operations that can produce some of the most innovative results, while also helping to keep the whole endeavour on track.  At the moment, we seem to be very far from that balance point, and are instead playing out Adriaan Blaauw’s nightmare.

Politics. Perception. Philosophy. And Physics.

Today is the start of the new academic year at the University of Nottingham (UoN) and, as ever, it crept up on me and then leapt out with a fulsome “Gotcha”. Summer flies by so very quickly. I’ll be meeting my new 1st year tutees this afternoon to sort out when we’re going to have tutorials and, of course, to get to know them. One of the great things about the academic life is watching tutees progress over the course of their degree from that first “getting to know each other” meeting to when they graduate.

The UoN has introduced a considerable number of changes to the “student experience” of late via its Project Transform process. I’ve vented my spleen about this previously but it’s a subject to which I’ll be returning in the coming weeks because Transform says an awful lot about the state of modern universities.

For now, I’m preparing for a module entitled “The Politics, Perception and Philosophy of Physics” (F34PPP) that I run in the autumn semester. This is a somewhat untraditional physics module because, for one thing, it’s almost entirely devoid of mathematics. I thoroughly enjoy F34PPP each year (despite this amathematical heresy) because of the engagement and enthusiasm of the students. The module is very much based on their contributions — I am more of a mediator than a lecturer.

STEM students are sometimes criticised (usually by Simon Jenkins) for having poorly developed communication skills. This is an especially irritating stereotype in the context of the PPP module, where I have been deeply impressed by the quality of the writing the students submit. As I discuss in the video below (an overview of the module), I’m not alone in recognising this: articles submitted as F34PPP coursework have been published in Physics World, the flagship magazine of the Institute of Physics.


In the video I note that my intention is to upload a weekly video for each session of the module. I’m going to do my utmost to keep this promise and, moreover, to accompany each of those videos with a short(ish) blog post. (But, to cover my back, I’ll just note in advance that the best laid schemes gang aft agley…)

How universities incentivise academics to short-change the public

This is going to be a short post (for a change). First, you should read this by David Colquhoun. I’ll wait until you get back. (You should sign the petition as well while you’re over there).

In his usual down-to-earth and incisive style, Colquhoun has said just about everything that needs to be said about the shocking mismanagement of King’s College London.

So why am I writing this post? Well, it’s because KCL is far from alone in using annual grant income as a metric for staff assessment – the practice is rife across the UK higher education sector. For example, the guidance for performance review at Nottingham contains this as one of the assessment standards: “Sustained research income equal to/in excess of Russell Group average for the discipline group”. Nottingham is not going out on a limb here – our Russell Group ‘competitors’ have similar aspirations for their staff.

What’s wrong with that you might ask? Surely it’s your job as an academic to secure research income?

No. My job as an academic is to do high-quality research. Not to ‘secure research income’. It’s all too easy to forget this, particularly as a new lecturer when you’re trying to get a research group established and gain a foothold on the career ladder. (And as a less-new lecturer attempting to tick the boxes for promotion. And as a grizzled old academic aiming to establish ‘critical mass’ on the national or international research ‘stage’.)

What’s particularly galling, however, is that the annual grant income metric is not normalised to any measure of productivity or quality. So it says nothing about value for money. Time and time again we’re told by the Coalition that in these times of economic austerity, the public sector will have to “do more with less”. That we must maximise efficiency. And yet academics are driven by university management to maximise the amount of funding they can secure from the public pot.

Cost effectiveness doesn’t enter the equation. Literally.

Consider this. A lecturer recently appointed to a UK physics department, Dr. Frugal, secures a modest grant from the Engineering and Physical Sciences Research Council for, say, £200k. She works hard for three years with a sole PhD student and publishes two outstanding papers that revolutionise her field.

Her colleague down the corridor, Prof. Cash, secures a grant for £4M and publishes two solid, but rather less outstanding, papers.

Who is the more cost-effective? Which research project represents better value for money for the taxpayer?
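To make the hypothetical concrete, here’s a trivial back-of-the-envelope calculation in Python. The names and figures are the made-up ones from the example above, not real data — and note that even this crude ratio says nothing about paper quality, which the raw grant-income metric also ignores:

```python
# Hypothetical figures from the example above -- not real data.
researchers = {
    "Dr. Frugal": {"grant_gbp": 200_000, "papers": 2},
    "Prof. Cash": {"grant_gbp": 4_000_000, "papers": 2},
}

for name, record in researchers.items():
    cost_per_paper = record["grant_gbp"] / record["papers"]
    print(f"{name}: £{cost_per_paper:,.0f} of public money per paper")

# Dr. Frugal: £100,000 of public money per paper
# Prof. Cash: £2,000,000 of public money per paper
```

By the raw annual-income metric, Prof. Cash outperforms Dr. Frugal twentyfold; by cost per paper, the ranking is exactly reversed.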

…and which academic will be under greater pressure from management to secure more research income from the public purse?

Image: Coins, the acquisition of which is not university departments’ main aim. Credit: https://www.maxpixel.net/Golden-Gold-Riches-Treasure-Rich-Coins-Bounty-1637722

The laws of physics are undemocratic

shard-1841278_960_720

Yesterday saw the start of the Circling the Square conference at the University of Nottingham. This is a rather unusual meeting which has the lofty aim of bringing together social scientists, those in the arts and humanities, policy ‘wonks’ (for want of a better term), science communicators, and natural scientists (including physicists, of course) to discuss the various interconnected aspects of research, politics, media, and impact.

As one of the conference organisers, I was delighted that the first day featured fascinating keynote lectures, lively discussion, and a rather heated exchange amongst panellists (more on this below). In the afternoon, two of the UK’s most successful science bloggers, David Colquhoun and physicsfocus’s own Athene Donald, gave their thoughts and opinions on the role of new and old media in science communication, debating and discussing the issues with the other panel members – Felicity Mellor and Jon Turney – and a number of contributors from the floor. Andrew Williams’ media keynote lecture preceded the “Researchers facing the media” panel session and was full of important and troublesome insights into just how science can be distorted (for good or bad) through the lens of the media.

But it was the first panel session of the conference, on the science-policy interface, that got me somewhat hot under the collar. (Well, OK, I was wearing a t-shirt so perhaps this isn’t the best metaphor…). That’s because that particular panel provided a telling insight into the gulf that still exists between natural and social scientists when it comes to the interpretation and contextual underpinnings of scientific data. Until we find a way to reconcile views spanning this gulf, we’re going to continue to exist in our silos, as two distinct cultures, arguably even more divided within the sciences than CP Snow could ever have envisaged for our separation from the arts and humanities.

The panel featured a ‘robust’ exchange of views – if you’ll excuse my borrowing of a hoary old euphemism – on the interpretation of scientific data and just how it is used to inform political debate and decisions. Chris Tyler, of the Parliamentary Office of Science and Technology, forcefully put forward his view that we can never consider scientific results in isolation from the political process. Sheila Jasanoff, Professor of Science and Technology Studies at Harvard, had earlier made very similar comments in the light of engaging presentations made by Daniele Fanelli and Beth Taylor on the interface between scientific research and policymaking. The overall tone of the debate is perhaps best summed up in this tweet from Roger Pielke (who is also speaking at the conference today in the “Challenging Established Science” panel):

Fanelli made an impassioned argument countering the idea that scientific evidence must always be considered in the context of its political framing. His comments certainly resonated with me, and I’d be rather surprised if what he said didn’t also strike a chord with the other physical/life scientists in the audience. We spend our lives aiming to do experiments in as disinterested a fashion as possible. It therefore rankles to be told that objective – and I use that word unashamedly – scientific evidence is nothing more than opinion.

For my colleagues in sociology and science and technology studies, I should stress that I am not for one second suggesting that scientists are immune to social biases. John Ziman, physicist-turned-sociologist, rightly disparaged the idea that scientists are always disinterested seekers of the truth, describing it as “the Legend”. Nor am I suggesting that data interpretation is not part and parcel of the scientific method (as Neuroskeptic argues convincingly).

The discussion yesterday, however, dangerously strayed very close at times to the ‘cultural relativism’ that was so successfully lampooned by Alan Sokal back in the nineties. Yes, scientific evidence must be considered as just one element – and, unfortunately, it’s often a very small element – of the political process. It would be naïve, at best, to argue otherwise. But the entire rationale for scientific research is underpinned by the understanding that we, as scientists, should always aim to put aside those socio-political and cultural biases. Otherwise, objective scientific evidence is reduced to pure opinion. Newton’s laws of motion, E = mc², the Schrödinger equation, the speed of light, and the first and second laws of thermodynamics are not culturally or politically determined. Those same laws are just as valid for a race of small blue furry creatures from Alpha Centauri as they are for us.

Or, as Sokal famously put it,

“…anyone who believes that the laws of physics are mere social conventions is invited to try transgressing those conventions from the windows of my apartment. (I live on the twenty-first floor.)”

Image: The Shard in London, currently the European Union’s tallest building and a prime location to test the idea that the laws of gravity are merely an opinion. Credit: https://www.maxpixel.net/Skyscraper-Shard-Architecture-London-Landmark-949752