Siletewaweqin dess bilognal (“pleased to meet you”), Ethiopia

I’m writing this from a room in the Ras Amba hotel in Addis Ababa, Ethiopia, having arrived here on Friday morning for the ENTHUSE (ENhancing THe Understanding of PhySics in Ethiopia: Student-led Outreach) project. (Please excuse the rather tortured and tortuous route to a memorable acronym for the project title. I’ve clearly written too many European grant proposals…)

ENTHUSE is a project funded largely by the University of Nottingham’s Cascade campaign but involves close collaboration and support from the Institute of Physics. I’m also very grateful to the School of Physics and Astronomy for contributing not only financially but in many other more indirect, though no less important, ways. The key objectives and motivations for ENTHUSE are to connect with the physics teaching community in Ethiopia, to share ideas and experience about teaching experimental physics, and, if I can quote my Head of School, to broaden “the experience for the students, involving the development of new teaching materials and demonstrations, and the delivery of that content in a completely different environment, [providing] a truly life-changing opportunity”.

I’m here in Ethiopia for the next week as a member of a team of eight comprising three undergrads (Emma Woods, Jarrod Lewis, and Tiago Goncalves) and one postgrad (Jeremy Leaf) from Physics & Astronomy at Nottingham; Christine Cleave and Bill Poole who are representing the IOP and who have enthusiastically and tirelessly driven the IOP’s Physics in Ethiopia project for the past seven years; and Sean Riley, a film-maker who works closely with Brady Haran and is responsible for the very popular Computerphile series of videos.

My aim is to provide daily updates on our time in Ethiopia via this blog. (I’ll do my best. Promise.) I also hope to upload guest posts from the students involved with ENTHUSE over the coming week.

Unfortunately, our arrival into Addis Ababa Bole airport yesterday did not initially bode well for the week ahead — Sean’s main camera (and all of its associated multi-faceted widgets) was, in essence, impounded (despite us being weighed down with the appropriate documentation, lists, letters, and visas). Luckily, however, we had a second camera with us, which Sean has ingeniously modded so that the filming can go ahead. (We’re also hoping that Sean’s main camera will be released early next week.)

Today was spent exploring Addis Ababa, before we travel south to Adama tomorrow to prepare for the training course for high school physics teachers we’re running there next week. Addis Ababa is fascinating. Founded less than 150 years ago (by Emperor Menelik II), it now has an estimated population of nearly 3.5M (although there are claims that the figure is actually closer to 5M) which, according to the Wikipedia page, is made up of 80 or more different nationalities speaking 80 different languages. Addis, or at least the region of Ethiopia in which it’s based, can also lay a strong claim to being the “birthplace” of humanity; the skeleton of Lucy is preserved here at the National Museum of Ethiopia.

Addis is a city of deep contrasts — I was struck by the extent to which areas of relative affluence (shopping malls, cinemas, restaurants) exist practically side-by-side with shanty towns. Walking through the poorer areas of the city put all of my First World problems and concerns right into perspective; it was a humbling — and necessarily unsettling — experience.

We visited Entoto, a village in the suburbs of Addis Ababa close to the summit of Mount Entoto, passing by countless heavily-laden donkeys on the way there. Ethiopia has one of the highest donkey populations in the world and they are used to carry a wide range of commodities. As discussed by Gebreab and colleagues, donkeys make a major contribution to transport in Ethiopia and are thus key contributors to the local economy.

An information-packed guided tour around the Emperor Menelik and Empress Taitu Memorial Museum at Entoto was followed by a visit to the Maryam Church, an octagonal construction (to represent the seven archangels + their god).

We then returned to Addis and met up with a number of the teachers who will be involved with the training programme. I’ll not tell you too much about the training programme itself for the moment, as it’ll be the focus of not only future blog posts but also a Sixty Symbols video (or two).

At this point we were all keen for a coffee break. The coffee plant originates in Ethiopia and so coffee is very much part of the culture here.  As someone with a long-standing interest in all things caffeine-related (including the deep links between coffee and quantum physics), I was particularly keen to have a coffee in the birthplace of the drink. We stopped off at a traditional Ethiopian coffee shop and sampled the local ‘brew’. Let’s just say we were not disappointed…

The day finished off with a meal at the Yod Abyssinia restaurant. As a vegetarian I was a little concerned about the variety of food that would be available — raw meat is very popular in Ethiopia — but I needn’t have worried. There were plenty of vegetarian options (as part of the fasting menu). Even better than the food, however, was the traditional Ethiopian music and dance; the performances were stunning. I’ll leave you with a short clip.

Another update tomorrow, internet connection willing, when we reach Adama.

If I hadn’t failed my exams, I wouldn’t be a professor of physics

I started writing this post a little after 06:00 this morning, the time at which schools and colleges were officially permitted to start releasing A-level results to hundreds of thousands of students across England, Wales, and Northern Ireland. I vividly remember the stomach-churning sense of dread thirty years ago as I awaited my Leaving Certificate results (the ‘Leaving’ is the Irish equivalent of the A-level system), and empathise with all of those students across the country biting their nails and pacing the floor as I write this.

By far the best advice for A-level students I’ve read over the last week was an open letter by Geoff Barton, Headteacher of King Edward VI school, to his Year 13 students, published in the TES on Tuesday: “Worrying about A-level results won’t help. They are out of your control“. Barton’s article resonated with me for a number of reasons, not least because I’m an undergraduate admissions tutor. It was the following paragraphs, however, that really hit home:

I know this because it happens each year, and it happened to me all those years ago when I failed one of my A-levels.

And what 30 years of experience has shown me is that if you end up not getting your first – or even second – choice of university place and have a tense couple of days on the phone sorting out new plans through the clearing process, then you will look back on this as something positive.

I ended up at a university I had never visited. It proved to be the best thing that happened in my education. And, like me, each year students come back at Christmas from their first term at university telling us that the unexpected change of plans has worked out to be brilliant.

Fortunately, I didn’t fail any of my Leaving Certificate exams — extreme exam failure was to come later on in my academic career — and I went on to start my BSc in Applied Physics at Dublin City University the following month. DCU was a small university at the time and I made my choice to go there not on the basis of prestige or national/international ranking — in any case, the pseudostatistical, pseudoscientific, faux-quantitative nonsense of university league tables hadn’t yet been spawned back in 1985 — but solely on the sense of excitement and, indeed, ‘belonging’ I felt when I attended a DCU Physics open day. (I’ll not bang on about the dubious value of league tables again, except to repeat that many A-level students show a healthy and laudable cynicism when it comes to the numerology of university rankings.)

Barton’s point about exam failure is particularly well made. I’ve been a personal academic and pastoral tutor for undergraduate students at Nottingham for the last eighteen years and it is always heartbreaking to have to tell a tutee that they have failed exams or, worse, can’t progress on their preferred course. This, of course, feels like the end of the world to them: how can they ever recover from what they see as abject failure?

So I tell them that I failed Year 3 of my four-year BSc degree in Applied Physics at DCU.

Badly.

Appallingly badly.

For a couple of exam papers I did little more than write my name on the cover sheet. This was because I was rather more focused on the band I was in at the time, returning home to Monaghan at weekends to rehearse/play gigs and using my revision time to write riffs, lyrics, and songs.

Not clever.

But if I hadn’t failed my third year exams, and had to resit the year, then I am absolutely certain that I would have similarly drifted through my fourth year and graduated with, at the very best, a low 2.2 or, most likely, a 3rd class degree. Failing my exams, in the words of a band whose songs we used to cover at the time, hit me “like a battering ram”. I repeated 3rd year and went into my final year with many orders of magnitude more motivation and commitment. I graduated with a 2.1 (the pass mark I was ‘carrying’ from my third year due to the resits didn’t, let’s say, work in my favour) — enough to take up a PhD.

Less than a year into my PhD I knew I wanted to pursue a career in academia (for the reasons discussed here).

I recount this story to tutees and students who have failed exams to echo Barton’s advice that it really isn’t the end of the world when things don’t go to plan. I certainly don’t recommend failing exams as an effective study skill or as an efficient strategy for career development. Nonetheless, a failed exam or two can often act as a catalyst to improve a student’s overall motivation and performance.

But that’s enough about me. My secondary school and undergraduate days are so far in the past that my memories of those times have a subtle reddish hue. Let’s instead hear from Jason Patrone, who graduated last month from Nottingham with a thoroughly well-deserved 1st class hons BSc in Physics (and is featured on the front cover of the School’s most recent newsletter):

I got a C, D and E grade at A-level. I then worked for six years in a job I didn’t find rewarding, before making the decision to return to university in 2011. I did the Foundation Year because of the ‘non-standard’ A-level grades, getting an overall mark of 81% for the year. I then transferred to the BSc and for each year of the degree I secured a 1st class mark.

The second year of the BSc I found the most challenging. Would I have put the same effort in, come the 2nd year crunch time, if I had sailed through A-levels? I doubt it.

Whether it means a kick up the arse for a bogey year/bad results, or facing the harsh realities of a crap job, any glimpse at what bad results leads to — or even just a blunt reminder that you didn’t do what you know you are capable of — works wonders.

Or, as Barton so eloquently puts it in his open letter, “the reality is that sometimes it’s the unexpected events in our lives that are the richest and most rewarding.”


[Edit 13/08/2015, 11:03 — Drat. Forgot to mention that the cartoon above is from the wonderful xkcd and that it’s made available under a Creative Commons licence.]

Lies, damned lies, and Ofsted’s pseudostatistics


First published at physicsfocus.

It’s been a week since Michael Gove was unceremoniously given the boot from his role as Education Secretary. The cheers of teachers still echo around staff rooms and schoolyards up and down the country.

Gove was variously described as incredibly unpopular, a hate figure, utterly ruthless, and a “toxic liability”. And that was just by his colleagues in the Coalition. (Allegedly.) Those who shared his simple-minded, wilfully uninformed, and proto-Victorian views on education, including a certain Richard Littlejohn, saw Gove’s unpopularity as arising simply because he was driving through what they considered to be essential reforms of an ailing education system. (My deep apologies for the preceding link to a Daily Mail article and its associated sidebar of shame. It won’t happen again. I also offer a posthumous apology to those Victorians who would likely have baulked at the suggestion that their educational methods were as backward-looking as those of Gove.)

Just why are Littlejohn and his reactionary ilk so certain that the English education system is, as they’d have it, going to hell in a handcart? A very large part of the reason is that they naively, quaintly, yet dangerously assume that education is equivalent to a competitive sport where schools, teachers, and children can be accurately assessed on the basis of positions in league tables. What’s worse – and this is particularly painful for a physicist, or indeed anyone with a passing level of numeracy, to realise – is that this misplaced and unscientific faith in the value of statistically dubious inter-school comparisons is at the very core of the assessment culture of the Office for Standards in Education, Children’s Services and Skills (Ofsted).

An intriguing aspect of the swansong of Gove’s career as Education Secretary was that he more than once ‘butted heads’ with Michael Wilshaw, head of Ofsted. One might perhaps assume that this was a particularly apposite example of “the enemy of my enemy is my friend”. Unfortunately not. Ofsted’s entirely flawed approach to the assessment of schools is in many ways an even bigger problem than Gove’s misplaced attempts to rewind education to the halcyon, but apocryphal, days of yore.

Moreover, Gove’s gone. Ofsted is not going anywhere any time soon.

I’ve always been uncomfortable about the extent to which number-abuse and pseudostatistics might be underpinning Ofsted’s school assessment procedures. But it was only when I became a parent governor for my children’s primary school, Middleton Primary and Nursery School in Nottingham, that the shocking extent of the statistical innumeracy at the heart of Ofsted’s processes became clear. (I should stress at this point that the opinions about Ofsted expressed below are mine, and mine alone.)

Middleton is a fantastic school, full of committed and inspirational teachers. But, like the vast majority of schools in the country, it is subject to Ofsted’s assessment and inspection regime. Ofsted’s implicit assumption is that the value of a school like Middleton, and, by extension, the value of the teachers and students in that school, can be reduced to a set of objective and robust ‘metrics’ which can in turn be used to produce a quantitative ranking (i.e. a league table). Even physicists, who spend their career wading through reams of numerical data, know full well that not everything that counts can be counted. (By the way, I use the adjective “inspirational” unashamedly. And because it winds the likes of Littlejohn and Toby Young up. As, I’d imagine, does starting a sentence with a conjunction and ending it with a preposition.)

But let’s leave the intangible and unquantifiable aspects of a school’s teaching to one side and instead critically consider the extent to which Ofsted’s data and processes are, to use that cliché beloved of government ministers, fit for purpose. In its advice to governors, Ofsted – rather ironically, as we’ll see — stresses the key importance of objective data and highlights that the governing board should assess the school’s performance on the basis of a number of measures which are ‘helpfully’ summarised at websites such as the Ofsted Data Dashboard and RAISE Online.

Ofsted’s advice to governors tacitly assumes that the data it provides, and the overall assessment methodology which gives rise to those data, are objective and can be used to robustly monitor the performance of a given school against others. Let’s just take a look at the objective evidence for this claim.

During the governor training sessions I attended, I repeatedly asked to what extent the results of Ofsted inspections (and other Ofsted-driven assessment schemes) were reproducible. In other words, if we repeated the inspection with a different set of inspectors, would we get the same result? If not, in what sense could Ofsted claim that the results of an inspection were objective and robust? As you might perhaps expect, I singularly failed to get a particularly compelling response to this question. This was for a very good reason: the results of Ofsted inspections are entirely irreproducible. A headline from the Telegraph in March this year said it all: Ofsted inspections: You’d be better off flipping a coin. This was not simply media spin. The think-tank report, “Watching the Watchmen”, on which the article was based, actually goes further: “In fact, overall the results are worse than flipping a coin”.

It’s safe to say that the think-tank in question, Policy Exchange, is on the right of the political spectrum. It is also perhaps not entirely coincidental that one of its founding members was a certain Michael Gove, and that the Policy Exchange report on Ofsted was highlighted by the right-of-centre press during the period of spats between Wilshaw and Gove mentioned above. None of that, however, detracts from the data cited in the report. These resulted from the work of Robert Coe and colleagues at Durham University and stemmed from a detailed study involving more than 3000 teachers. Coe has previously criticised Ofsted’s assessment methods in the strongest possible terms, arguing that they are not “research-based or evidence-based”.

Ofsted asks governors to treat its data as objective and to draw conclusions accordingly. However, without a suitable ‘control’ study – which in this case is as simple as running independent assessments of the same class with different inspectors – the data on inspections simply cannot be treated as objective and reliable. In this sense, Ofsted is giving governors, schools, and, more generally, the public exceptionally misleading messages.
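
A control of this kind needn’t be elaborate. Purely as an illustration (the lesson grades below are invented, and this is my own sketch rather than anything Ofsted or the Durham team actually ran), here is how such a paired-inspection exercise could be scored: two inspectors independently grade the same lessons on a four-point scale, and we ask how often they agree beyond what chance alone would produce, via Cohen’s kappa.

```python
# Invented illustration: two inspectors independently grade the same twelve
# lessons (1 = outstanding ... 4 = inadequate). How often do they agree
# beyond chance? Cohen's kappa answers that; 0 means no better than chance.
from collections import Counter

inspector_a = [1, 2, 2, 3, 1, 4, 2, 3, 3, 2, 1, 2]
inspector_b = [2, 2, 3, 3, 1, 3, 2, 2, 3, 1, 2, 2]
n = len(inspector_a)

# Raw agreement: fraction of lessons given the same grade by both inspectors
observed = sum(a == b for a, b in zip(inspector_a, inspector_b)) / n

# Agreement expected by chance, from each inspector's own grade frequencies
freq_a, freq_b = Counter(inspector_a), Counter(inspector_b)
chance = sum(freq_a[g] * freq_b[g] for g in range(1, 5)) / n ** 2

kappa = (observed - chance) / (1 - chance)
print(f"raw agreement {observed:.2f}, chance {chance:.2f}, kappa {kappa:.2f}")
```

Run that arithmetic over genuinely independent Ofsted-style judgements of the same lessons and you would at least have a defensible number for how reproducible an inspection grade actually is.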

But it gets worse…

The lack of rigour in Ofsted’s inspections is just one part of the problem. It’s compounded in a very worrying way by the shocking abuse of statistics that forms the basis of the Data Dashboard and RAISE Online. Governors are presented with tables of data from these websites and asked to make ‘informed’ decisions on the basis of the numbers therein. This, to be blunt, is a joke.

It would take a lengthy series of blog posts to highlight the very many flaws in Ofsted’s approach to primary and secondary school data. Fortunately, those posts have already been written by a teacher who has to deal with Ofsted’s nonsense on what amounts to a daily basis. I thoroughly recommend that you head over to the Icing On The Cake blog where you’ll find this, this, and this. The latter post is particularly physicist-friendly, given that it invokes Richard Feynman’s “cargo cult science” description of pseudoscientific methods (in the context of Ofsted’s methodology). It’s also worth following Icing On The Cake on Twitter if you’d like regular insights into the level of the data abuse which teachers have to tolerate from Ofsted.

Coincidentally, I stumbled across that blog after I had face-palmed my way (sometimes literally) through a meeting in which the Ofsted Data Dashboard tables were given to governors. I couldn’t quite believe that Ofsted presented the data in a way such that the average first-year physics or maths undergraduate could drive a coach and horses right through it (if you’ll excuse the Goveian metaphor). So I went home and googled the simple term “Ofsted nonsense”. Right at the top of the list of hits were the Icing On The Cake posts (followed by links to many other illuminating analyses of Ofsted’s assessment practices).

I’m not going to rehash those posts here – if you’ve got even a passing interest in the education system in England you should read them (and the associated comments threads) for yourself and reach your own conclusions. To summarise, the problems are multi-faceted but can generally be traced to simple “rookie” flaws in data analysis (a short simulation following the list shows how easily they bite). These include:

  1. Inadequate appreciation of the effects of small sample size;
  2. A lack of consideration of statistical significance/uncertainties in the data. (Or, at best, major deficiencies in communicating and highlighting those uncertainties);
  3. Comparison of variations between schools when the variation within a given school (from year to year) can be at least as large;
  4. An entirely misleading placement of schools in “quintiles” when the difference between the upper and lower quintiles can be marginal. Ofsted has already had to admit to a major flaw in its initial assignment of quintiles.
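
To make the first, third, and fourth points concrete, here is a minimal sketch. The school count, cohort size, and pass rate are numbers I have invented for illustration; this has nothing to do with Ofsted’s actual data or methodology. Every simulated school is given exactly the same underlying pass rate, yet the quintile tables still reshuffle substantially from one “year” to the next through nothing more than small-sample noise.

```python
# Illustrative simulation only -- invented numbers, not Ofsted data.
# 100 schools with an IDENTICAL underlying pass rate are ranked into
# quintiles from one small cohort's results, twice, to see how much the
# "league table" moves by chance alone.
import numpy as np

rng = np.random.default_rng(1)

n_schools = 100    # hypothetical number of schools
cohort = 30        # pupils per year group -- a small sample
true_rate = 0.70   # every school has the SAME 'true' pass rate

def quintile(scores):
    """Assign each school to a quintile (1 = bottom 20%, 5 = top 20%).
    Ties are broken arbitrarily, as they must be with coarse pass counts."""
    ranks = scores.argsort().argsort()
    return ranks * 5 // len(scores) + 1

# Two consecutive 'years': observed pass counts are pure binomial noise
year1 = quintile(rng.binomial(cohort, true_rate, n_schools))
year2 = quintile(rng.binomial(cohort, true_rate, n_schools))

moved = np.abs(year1 - year2)
print(f"Schools changing quintile between years: {np.sum(moved > 0)}/{n_schools}")
print(f"Schools moving by two or more quintiles: {np.sum(moved >= 2)}/{n_schools}")
```

In typical runs well over half of these identical schools change quintile from one year to the next, and a sizeable fraction jump two or more: the within-school, year-to-year variation is every bit as large as the between-school differences the dashboard invites governors to agonise over.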

What is perhaps most galling is that many A-level students in English schools will be taught to recognise and avoid these types of pitfall in data analysis. It is an irony too far that those teaching the correct approach to statistics in English classrooms are assessed and compared to their peers on the basis of Ofsted’s pseudostatistical nonsense.

Image: Manipulating data. Credit: https://www.publicdomainpictures.net/en/view-image.php?image=45242

The vacuity of ‘excellence’


Originally published at physicsfocus.

This post has been simmering and in gestation for quite a while. This week, however, a number of documents arrived in my inbox to finally catalyse me into putting pen to paper. (Literally. I wrote this out long-hand before editing while typing it up. If you think that it’s vitriolic and ranty now, you should have seen the first scribbled draft…)

The source of my irritation? Well, take a look at the five statements below, each culled from the website of a leading UK university. (The names of the institutions have been omitted to protect the guilty).

 “Through research of international excellence, to increase significantly the range of human knowledge and understanding…”

“We seek the highest distinction in research and scholarship and are committed to excellence in all aspects of education and transmission of knowledge.”

“By bold innovation and excellence in all that we do, we make both knowledge and discoveries matter. “

“.. we want to rise further and be amongst the very few premier global universities. We will achieve this through the excellence of our research and teaching…”

“The University …. combines academic excellence with an innovative and entrepreneurial approach to research…”

Do you see a common theme here? Yep, it’s that word – “excellence”. (Those are just five examples out of countless others. Go to any university website and type “excellence” into the search box – you’ll be swamped by links.)

It’s not only the marketing blurb for universities that is riddled with references to excellence. The tagline for Research Councils UK is “Excellence with Impact”; UK academics have just been subjected to the rigours of data collection for HEFCE’s Research Excellence Framework (and the associated game-playing over just who is “excellent” and who isn’t); Ofsted has its “excellence gateway”; the NHS is “energised for excellence”; and even the British Parking Association celebrates parking excellence.

But what does “committed to excellence” actually mean?

Here’s what it means: Nothing. Absolutely nothing. It’s nothing more than the worst form of tedious, clichéd, vacuous, buttock-clenchingly awful marketing hyperbole.

What else is a university, or any type of organisation, going to do other than try to be excellent? Strive for mediocrity? Pursue adequate performance? Try to be a little better than the rest, but not aim too high?

Ugh.

Seventeen years ago, in The University in Ruins, Bill Readings described the many problems resulting from academia’s reliance on the nebulous concept of excellence. (Thanks to my colleague at Nottingham, John Holmwood, for making me aware of Readings’ excellent book). Here’s one particularly insightful argument:

“The point is not that no-one knows what excellence is, but that everyone has his or her own idea of what it is. And once excellence has been accepted as an organizing principle, there is no need to argue about differing definitions… if a particular department’s kind of excellence fails to conform, then that department can be eliminated without apparent risk to the system.”

(In this context, changing the name of the UK’s national research assessment exercise to the Research Excellence Framework makes a great deal of sense.)

Readings goes on to discuss what he describes as the “empty notion of excellence”. There’s an important concept in semiotics which captures this vacuity: the empty (or floating) signifier. An empty signifier is literally meaningless – it doesn’t represent any particular object or meaning which is universally agreed. “Excellence” is as good an example of an empty signifier as one could hope to find.

It takes a particularly insidious form of hypocrisy for UK universities to argue that they will develop the critical thinking skills of their students while at the same time they proclaim a commitment to excellence in everything they do. Laurie Taylor’s wonderful spoof University of Poppleton, with its commitment to being “fair to middling at everything”, at least has the advantage of a clear and original mission statement.

Image: Space is mostly vacuum, but it’s not nearly as empty as meaningless commitments to “excellence”. Credit: NASA/ESA