Tuesday, 17 April 2018


During audits and safety meetings I have often been asked ‘what are your top 5 risks?’ I have a problem with that question…

Let me start with a closer look at how the aviation industry historically quantifies ‘risk’. Once the system identifies a ‘hazard’ in the operating environment, we reach for a risk matrix of some kind – typically based on the 5 x 5 example described in ICAO Doc 9859, the Safety Management Manual. You know the one: ‘severity’ along one axis and ‘probability’ along the other.

I don’t really have a problem with the severity scale; it seems quite reasonable to imagine what the ‘worst case feasible outcome’ of the hazard could be and attach a severity in relation to the word pictures associated with the scale. But what about probability? Across the scale you will usually see 5 possible choices ranging from ‘very likely’ to ‘very unlikely’ or similar. What do they mean? If you look in the dictionary for ‘likely’ it will say something like ‘such as well might happen or be true; probable’, but that won’t mean a lot to a risk assessor. To help, we tend to develop simpler word pictures to make the choice easier and more consistent, or we might add a mathematical probability like ‘once in 10,000 flights’.
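To make the mechanics concrete, here is a minimal sketch of a 5 x 5 matrix lookup, loosely modelled on the ICAO Doc 9859 example. The axis labels and the composite index scheme here are my illustrative assumptions, not the ICAO text itself.

```python
# A minimal sketch of a 5 x 5 risk matrix lookup. The labels and the
# risk-index scheme are illustrative assumptions, not ICAO Doc 9859 verbatim.

SEVERITY = ["Negligible", "Minor", "Major", "Hazardous", "Catastrophic"]  # 1..5
PROBABILITY = ["Extremely improbable", "Improbable", "Remote",
               "Occasional", "Frequent"]                                  # A..E

def risk_index(severity: str, probability: str) -> str:
    """Combine the two axes into a composite index like '4C'."""
    s = SEVERITY.index(severity) + 1        # numeric severity 1..5
    p = PROBABILITY.index(probability)      # letter probability A..E
    return f"{s}{'ABCDE'[p]}"

print(risk_index("Hazardous", "Remote"))    # -> 4C
```

The lookup is trivially easy to code; the hard part, as the paragraph above suggests, is agreeing what the probability labels actually mean.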

The trouble is that, once we have accepted that there is a probability of greater than zero, we need to be prepared for the outcome to happen at any time. Even if the probability is once in 10,000,000 flights, that accepts that it could occur on the next flight or the 10,000,000th one, or anywhere in between. So for any activity that we propose to repeat indefinitely, like going flying, we must accept an inevitable occurrence whatever the probability.
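The arithmetic behind that point is worth spelling out: if each flight carries an independent per-flight probability p, the chance of at least one occurrence over n flights is 1 − (1 − p)^n, which climbs towards certainty however small p is. A quick illustration (the figures are mine, chosen only to match the ‘once in 10,000,000 flights’ example above):

```python
# Probability of at least one occurrence in n independent flights,
# given a per-flight probability p: 1 - (1 - p)**n.

def at_least_once(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 1 / 10_000_000          # 'once in 10,000,000 flights'
for n in (1, 1_000_000, 10_000_000, 100_000_000):
    print(f"{n:>11,} flights: {at_least_once(p, n):.3f}")
```

Run it and you will see the chance pass 60% by the 10,000,000th flight and approach certainty soon after: repetition, not the per-flight number, is what makes the occurrence inevitable.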

Can I tell you what my ‘top 5’ risks are? No. While each of my identified hazards may have differing probabilities, they do have a probability and I don’t know which is going to happen next.

Wednesday, 29 March 2017


A British national daily, among other media outlets, has been running with a story that a Bombardier Challenger business jet encountered wake turbulence from an Airbus A380 over the Indian Ocean. The story says that the encounter was so severe that the bizjet was rolled inverted and lost 10,000 feet in altitude. Photos of the cabin interior showed total devastation, and when the aircraft diverted to Muscat, Oman, some passengers were taken to hospital.

But wait a minute, the A380 has been in service for over 10 years and there are now more than 200 of them criss-crossing the skies every day. Much of the world's upper airspace is operated on reduced vertical separation minima (RVSM), meaning that vertical separation between opposite-direction aircraft is 1,000 feet.

So why hasn't this happened before? We know that the A380 sits in its own, higher wake turbulence category, but if it were dragging around vortices capable of inverting a sizeable business jet, surely there would have been more severe wake turbulence reports by now?

Or perhaps there is something we don't know...

Tuesday, 31 January 2017

COGNITIVE DISSONANCE - a factor in 'Pressonitis'?

Firstly I should make it clear that I am not a psychologist, nor in the truest sense of the word am I a scientist, although as an aviator I have a broad understanding of a lot of science. My knowledge of this topic in particular comes from extensive research into why pilots were flying approaches to land – the ‘approach’ being the last part of the flight, descending towards the runway – when all of the available evidence indicated that the landing could not be achieved either safely or in compliance with operating procedures. The approach trajectory was too steep or too shallow, the aircraft was too fast or too slow, or the landing gear and flaps were not in the correct configuration. Pilots’ standard operating procedures required them to execute a ‘go-around’ in such circumstances – to abandon the approach, climb away safely and start again – but some were simply not complying. This ‘unstable approach’ phenomenon, as it is known, has been one of the most common contributory factors in commercial aviation accidents over the last 30 years or more, but the tendency to press on in spite of the evidence is not unique to pilots.

This brought me to the work of Bluma Zeigarnik, a psychologist and psychiatrist born in Lithuania at the turn of the last century. She is probably best known for studies inspired by her professor’s observation that a waiter appeared to have a much better recollection of orders that had yet to be paid for than of those which had already been settled. The waiter’s workflow involved taking the order, delivering the food and drinks and finally taking the money, at which point the workflow would be finished. He stored the order in his memory until the customer had paid and then subconsciously dumped it. In other words, an incomplete pattern of work held a much higher priority for retention in the memory than one which was effectively completed.

Zeigarnik went on to study school children learning in class and found that those who were interrupted in the course of their work remembered more, and more accurately, than those who were allowed to finish without interruption. In isolation that is interesting but doesn’t tell us a great deal. However, Zeigarnik and her successors have shown that the increased memory retention is attributable to a heightened level of cognitive arousal whilst a task is being conducted, which is replaced by a more satisfied lower arousal once the task is successfully completed. The heightened cognitive arousal was in turn attributed to a degree of discomfort that the goal may fail, discomfort that could only be assuaged by success. Nowadays we know this as the ‘Zeigarnik Effect’. To take it one step further, research suggested that humans remember bad things more clearly than they remember the good things; perhaps from a survival perspective this makes sense – we remember what has done us harm so that we can avoid it in future.

So finally, the outcome of this ‘cognitive dissonance’, the disparity between aspiration and reality during the conduct of a task, is that we humans harbour a compelling desire to complete a task once we have commenced it. This can be so compelling that we may press on although all of the indications, our instincts and maybe even our own colleagues are telling us to stop and rethink the strategy. This is what we found with the ‘unstable approaches’ continued to landing – pilots had become so focused on achieving the goal that they were able to ignore the evidence that it was failing – and it probably applies to many other aspects of professional and personal life.

Thursday, 12 January 2017


The first of these three things I have mentioned in this blog before. I spend a lot of time taking apart fatal aviation accidents, looking for the influences and factors which came together in the unique combination that allowed each ‘accident’ to happen. In recent years one of the most common precursors to a crash is procedural non-compliance – deviation from standard operating procedures by one or more of those involved. The reasons that pilots and other professionals deviate in this way are many and often related to complex human behavioural conditions; it is worth looking at Abraham Maslow’s hierarchy of human needs for insight into some of them. Whatever the underlying reasons, prior to a crash the captain will frequently decide to do something contrary to training and procedures – something that will eventually lead to their own demise. Sadly their co-pilots often look on, aware that all is not well but saying nothing.

The second thing is ‘risk denial’; maybe I have raised it in earlier posts. This is a condition that arises when we are regularly exposed to a particular, perhaps severe, hazard but it never actually does us any harm. Over time we may subconsciously adopt a mind-set that whilst the severity could be very high, the probability or likelihood is so low that it can be disregarded. Imagine passing a heavy truck in the opposite direction on a narrow lane – the obvious action would be to slow down and pull in to the side of the road to let it pass safely but every time you have passed a truck no harm has come of it. So you are able to ‘deny’ the risk, despite its blatancy, and drive on as normal with a metre or less between you and death.

So here is the final thing and I wonder if there is a connection between the three? Modern movies, TV shows and most significantly computer games allow us to experience horrifically dangerous and deadly situations without suffering any (other than perhaps psychological) harm. Could that have led to a general conditioning of westernised humanity (including pilots) to be able to subconsciously ignore hazards and adopt risky behaviours on the assumption that we will come to no harm however bad things look? After all, passengers now routinely collect their baggage before evacuating an aircraft, despite the high risk of fire and explosion. I don’t know…

Tuesday, 25 October 2016


After every mission, military pilots and their crew will hold a debrief to discuss what went well and what could have been done better. This is an opportunity for those involved in the task to recognise superior performance and to learn by addressing any deficiencies – makes sense, right?

However, post-flight debriefs in commercial aviation are quite rare, in spite of the obvious potential value. In many regulatory jurisdictions pilots and cabin crew are only considered to be ‘on duty’ until 30 minutes after ‘on-chocks’ time so there is a limited window of opportunity. Furthermore, the crew for the next scheduled flight will frequently be waiting to get on board to maximise preparation time during the short turnaround period. Perhaps most influential is the fact that it isn’t ‘the way we do things around here’ – it’s not part of the culture.

A pilot would think nothing of remarking on a colleague’s smooth landing in difficult crosswind conditions, for example, but there is unlikely to be much discussion if the landing went less well. The absence of a debrief effectively implies that the entire flight proceeded satisfactorily and in accordance with standard operating procedures: there were no errors, deviations, distractions and consequently no opportunities to learn. Any pilot knows that is never the case but, without a professional conversation immediately afterwards, the implication becomes reinforced.

Pilots are well used to debriefs after training flights so why not after every flight?


'Culture', a term we hear used a lot in aviation these days; organisational culture, reporting culture, safety culture, just culture... You name it, there's an applicable culture. But what does it mean?

If you look it up in a dictionary it will say something like 'common behaviours and beliefs shared by a group', and offer examples such as youth culture or drug culture. Alternatively you might hear it described as 'the way we do things around here'. These definitions imply that everyone in a group does certain things the same way, so how does that happen?

An example that comes to mind relates to the flashing of car headlights. Here in my native England, if someone else flashes their headlights at me I understand it to mean 'go ahead', and so does just about everyone else. However, I spent 17 years resident in Dubai, UAE, where flashing headlights means (emphatically) 'get out of my way!' One step further, if you are lucky enough to experience the joy of driving in Nairobi, Kenya, you will find a more complex flash-code. One long flash equates to 'don't go there', whereas two short flashes means 'please go ahead'.

Now, nowhere are these 'languages' written down, but you can see that failure to assimilate them quickly could result in an unpleasant outcome - we humans are fast learners when it comes to survival. That is the way they do things around here, and so we will do it too. An important lesson in culture: we tend to emulate those around us, whether we think it is right or wrong.

As a postscript, it is worth considering the likelihood of misunderstandings as a result of modern automatic headlight systems.

Tuesday, 6 September 2016


So the preliminary findings of the investigation into the Boeing 777 that crashed on landing in Dubai in August are that the pilots touched down ‘long’ and elected to initiate a ‘baulked landing’ manoeuvre, presumably to reduce the risk of an overrun. A baulked landing is very similar to a go-around – press the TOGA (take-off/go-around) switches to automatically increase thrust (the 777 has two levels of thrust response depending on the number of presses), partially retract the flap and, once a climb is established, raise the gear. However, with the wheels on the ground the TOGA switch thrust response is inhibited, so for a baulked landing the pilots must advance the throttles manually. A subtle but crucial difference which, if not practised regularly, may be overlooked in the heat of the moment.
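As a crude sketch of why that inhibit matters, the logic of the paragraph above can be written out as follows. This is emphatically not Boeing's actual control law – the function name and thrust values are my own illustrative assumptions:

```python
# Illustrative sketch of the go-around vs baulked-landing difference
# described above. NOT Boeing's control law: TOGA presses are inhibited
# once the wheels are on the ground, so thrust must then come from the
# pilot moving the throttle levers manually.

def commanded_thrust(toga_pressed: bool, on_ground: bool,
                     manual_throttle: float) -> float:
    """Return a commanded thrust fraction, 0.0..1.0."""
    if toga_pressed and not on_ground:
        return 1.0                # airborne: TOGA commands go-around thrust
    return manual_throttle        # on ground: only the throttle levers count

# Airborne go-around: the TOGA press works.
print(commanded_thrust(True, on_ground=False, manual_throttle=0.3))  # 1.0
# Baulked landing after touchdown: the same press alone does nothing.
print(commanded_thrust(True, on_ground=True, manual_throttle=0.3))   # 0.3
```

The same pilot action produces two different outcomes depending on a state the pilot may not be consciously tracking – which is exactly the mode-transition trap the next paragraph describes.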

Automation has all sorts of benefits in modern aircraft but, due to the very wide range of operating environments and manoeuvres, it has different regimes of logic for different phases of flight. If these regimes are not fully understood, together with the conditions that bring about the transition from one to another, then the automation, and hence the aircraft, may not respond as expected. This is not unique to Boeing; Airbus aircraft have suffered accidents for similar reasons. Today’s pilots must learn to ‘think like their aeroplane thinks’…