Friday, 13 July 2018


You know how it is. Introduced to a complete stranger at a polite drinks party, once you have exhausted the pleasantries, one of you is going to ask "What do you do then?" My wife, who has grown tired of watching me struggle to explain what I actually do, has taken to interjecting with "He's a pilot", which was once true of course. With a little time to think, here are some of the things I have done in the last couple of years:

Written and presented a course on Compliance Monitoring to an airline in Nepal, which almost unbelievably operates helicopters to Everest Base Camp as a routine, and exceptionally up to Camp 2 in emergencies;
Closely engaged with an ambitious and rapidly expanding airline in the UK, helping to develop an organisational culture which embraces the concept of a Management System as defined in ORO.GEN.200;
Spoken on the merits of procedural compliance at the Eurocontrol Safety Behaviours Forum in Brussels;
Run a human performance and error management workshop for a large group of anaesthetists at a hospital in Essex - interestingly they invited me to observe the teams in action for two days in live theatres ahead of the workshop, to better understand their work (challenging for someone who doesn't do blood very well);
Conducted an audit of a UK operator's management system, using the new EASA Management System Assessment Tool for the first time;
Acted as an expert witness in flight operations and safety in a European airline's defence against injury claims (all successfully to date);
On the subject of expert witnesses, I have developed and delivered a course on how to present expert evidence (see below);

Lukla, NEPAL
Delivered Safety Management System initial and recurrent training to another Nepalese operator - this one ferries thousands of trekkers each year in and out of Lukla, gateway to Everest and one of the world's more challenging destinations;

Led a safety review of an extreme aviation sports portfolio in the Middle East - skydiving, paramotors, gyrocopters etc;
Delivered Upset Prevention & Recovery Training (UPRT) to a group of flight instructors in Lithuania;
Presented seminars on Evidence Based Training to operators and industry professionals from around Europe and Africa;
Trained departmental Risk Champions for a UK operator;
Developed and directed several emergency response exercises;

You see why my wife says I'm a pilot...

Thursday, 28 June 2018


Are you an expert in your field? If you are, there is some possibility that you may be asked to act as an expert witness in a court case or arbitration. While you no doubt know your own subject very well, preparing written evidence for the court for the first time, and making your first appearance in the witness box, can seem quite daunting. Firstly, there are strict rules around the instruction of experts. Secondly, court procedure can be rather confusing. And finally, an experienced legal professional is going to do his or her best to dismantle your evidence and discredit your expertise.

Gates Aviation has recognised this challenge and recently launched a familiarisation programme for prospective expert witnesses. This consists of a four-hour face-to-face training session with a seasoned expert witness at a location of your choice, and an escorted visit to court while experts are giving evidence. It doesn't matter what your field of expertise is: this is all about how you deliver your evidence rather than what is in it.

If you want to be confident that your evidence is compliant and that you know what to expect in court, contact Gates Aviation to discuss this programme of expert witness preparation.

Tuesday, 17 April 2018


During audits and safety meetings I have often been asked ‘what are your top 5 risks?’ I have a problem with that question…

Let me start with a closer look at how the aviation industry historically quantifies ‘risk’. Once the system identifies a ‘hazard’ in the operating environment, we reach for a risk matrix of some kind – typically based on the 5 x 5 example described in ICAO Doc 9859, the Safety Management Manual. You know the one: ‘severity’ along one axis and ‘probability’ along the other.

I don’t really have a problem with the severity scale; it seems quite reasonable to imagine the ‘worst case feasible outcome’ of the hazard and attach a severity using the word pictures associated with the scale. But what about probability? Across the scale you will usually see 5 possible choices ranging from ‘very likely’ to ‘very unlikely’ or similar. What do they mean? If you look in the dictionary for ‘likely’ it will say something like ‘such as well might happen or be true; probable’, but that won’t mean a lot to a risk assessor. To help, we tend to develop simpler word pictures to make the choice easier and more consistent, or we might add a mathematical probability like ‘once in 10,000 flights’.
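The mechanics of such a matrix are simple enough to sketch in a few lines of code. The labels and tolerability bands below are illustrative assumptions in the general style of the ICAO 5 x 5 matrix, not the manual's exact wording:

```python
# Illustrative 5 x 5 risk assessment matrix, loosely in the style of
# ICAO Doc 9859. The labels and tolerability bands are assumptions
# for this sketch, not the manual's exact wording.
PROBABILITY = {  # 5 = most likely
    "frequent": 5, "occasional": 4, "remote": 3,
    "improbable": 2, "extremely improbable": 1,
}
SEVERITY = {  # A = worst feasible outcome
    "catastrophic": "A", "hazardous": "B", "major": "C",
    "minor": "D", "negligible": "E",
}

def risk_index(probability: str, severity: str) -> str:
    """Combine the two scales into an index such as '5A'."""
    return f"{PROBABILITY[probability]}{SEVERITY[severity]}"

def tolerability(index: str) -> str:
    """Crude three-band classification of a risk index."""
    # Map the letter back to 1..5 and multiply: '5A' -> 25, '1E' -> 1
    score = int(index[0]) * (ord("E") - ord(index[1]) + 1)
    if score >= 15:
        return "intolerable"
    if score >= 6:
        return "tolerable (mitigate)"
    return "acceptable"
```

So a ‘frequent’ / ‘catastrophic’ hazard scores 5A and lands in the intolerable band. The machinery is trivial; the difficulty, as the rest of this post argues, lies in the probability labels it depends on.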

The trouble is that, once we have accepted that there is a probability greater than zero, we need to be prepared for the outcome to happen at any time. Even if the probability is once in 10,000,000 flights, we accept that it could occur on the next flight or the 10,000,000th one, or anywhere in between. So for any activity that we propose to repeat indefinitely, like going flying, we must accept an inevitable occurrence whatever the probability.
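The arithmetic behind that inevitability is worth a quick sketch. Assuming independent flights with a fixed per-flight probability, the chance of at least one occurrence climbs towards certainty as the flights accumulate:

```python
def prob_at_least_one(p_per_flight: float, n_flights: int) -> float:
    """Probability of at least one occurrence in n independent flights."""
    return 1.0 - (1.0 - p_per_flight) ** n_flights

p = 1 / 10_000_000  # the "once in 10,000,000 flights" hazard above
for n in (1, 1_000_000, 10_000_000, 50_000_000):
    print(f"{n:>11,} flights: {prob_at_least_one(p, n):.1%}")
```

Roughly 63% after 10,000,000 flights and over 99% after 50,000,000: a probability greater than zero, repeated indefinitely, delivers an occurrence eventually.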

Can I tell you what my ‘top 5’ risks are? No. While each of my identified hazards may have differing probabilities, they do have a probability and I don’t know which is going to happen next.

Wednesday, 29 March 2017


A British national daily among other media outlets has been running with a story that a Bombardier Challenger business jet encountered wake turbulence from an Airbus A380 over the Indian Ocean. The story says that the encounter was so severe that the bizjet was rolled inverted and lost 10,000 feet in altitude. Photos of the cabin interior show total devastation and when the aircraft was diverted to Muscat, Oman, some passengers were taken to hospital.

But wait a minute, the A380 has been in service for over 10 years and there are now more than 200 of them criss-crossing the skies every day. Much of the world's upper airspace is operated on reduced vertical separation minima (RVSM), meaning that vertical separation between opposite direction aircraft is 1,000 feet.

So why hasn't this happened before? We know that the A380 has a higher wake turbulence category, but if it were dragging around vortices capable of inverting a sizeable business jet, surely there would have been more severe wake turbulence reports by now?

Or perhaps there is something we don't know...

Tuesday, 31 January 2017

COGNITIVE DISSONANCE - a factor in 'Pressonitis'?

Firstly I should make it clear that I am not a psychologist, nor in the truest sense of the word am I a scientist, although as an aviator I have a broad understanding of a lot of science. My knowledge of this topic in particular comes from extensive research into why pilots were flying approaches to land - the ‘approach’ being the last part of the flight descending towards the runway - when all of the available evidence indicated that the landing could not be achieved either safely or in compliance with operating procedures. The approach trajectory was either too steep or too shallow, the aircraft was too fast or too slow, or the landing gear and flaps were not in the correct configuration. Pilots’ standard operating procedures required them to execute a ‘go-around’ in such circumstances, to abandon the approach, climb away safely and start again, but some were simply not complying. This ‘unstable approach’ phenomenon, as it is known, has been one of the most common contributory factors in commercial aviation accidents over the last 30 years or more, but the tendency to press on in spite of the evidence is not unique to pilots.

This brought me to the work of Bluma Zeigarnik, a psychologist and psychiatrist born in Lithuania at the turn of the last century. She is probably best known for studies inspired by her professor’s observation that a waiter appeared to have a much better recollection of orders that had yet to be paid for than of those which had already been settled. The waiter’s workflow involved taking the order, delivering the food and drinks and finally taking the money, at which point the workflow would be finished. He stored the order in his memory until the customer had paid and then subconsciously dumped it. In other words, an incomplete pattern of work held a much higher priority for retention in the memory than one which was effectively completed.

Zeigarnik went on to study school children learning in class and found that those who were interrupted in the course of their work remembered more, and more accurately, than those who were allowed to finish without interruption. In isolation that is interesting but doesn’t tell us a great deal. However, Zeigarnik and her successors have shown that the increased memory retention is attributable to a heightened level of cognitive arousal whilst a task is being conducted, which is replaced by a more satisfied lower arousal once the task is successfully completed. The heightened cognitive arousal was in turn attributed to a degree of discomfort that the goal may fail, discomfort that could only be assuaged by success. Nowadays we know this as the ‘Zeigarnik Effect’. To take it one step further, research suggested that humans remember bad things more clearly than they remember the good things; perhaps from a survival perspective this makes sense – we remember what has done us harm so that we can avoid it in future.

So finally, the outcome of this ‘cognitive dissonance’, the disparity between aspiration and reality during the conduct of a task, is that we humans harbour a compelling desire to complete a task once we have commenced it. This can be so compelling that we may press on although all of the indications, our instincts and maybe even our own colleagues are telling us to stop and rethink the strategy. This is what we found with the ‘unstable approaches’ continued to landing – pilots had become so focused on achieving the goal that they were able to ignore the evidence that it was failing – and it probably applies to many other aspects of professional and personal life.

Thursday, 12 January 2017


The first of these three things I have mentioned in this blog before. I spend a lot of time taking apart fatal aviation accidents, looking for the influences and factors which came together in the unique combination that allowed each ‘accident’ to happen. In recent years one of the most common precursors to a crash is procedural non-compliance – deviation from standard operating procedures by one or more of those involved. The reasons that pilots and other professionals deviate in this way are many and often related to complex human behavioural conditions; it is worth looking at Abraham Maslow’s hierarchy of human needs for insight into some of them. Whatever the underlying reasons, prior to a crash the captain will frequently decide to do something contrary to their training and procedures that will eventually lead to their own demise. Sadly their co-pilots often look on, aware that all is not well but saying nothing.

The second thing is ‘risk denial’; maybe I have raised it in earlier posts. This is a condition that arises when we are regularly exposed to a particular, perhaps severe, hazard but it never actually does us any harm. Over time we may subconsciously adopt a mind-set that whilst the severity could be very high, the probability or likelihood is so low that it can be disregarded. Imagine passing a heavy truck in the opposite direction on a narrow lane – the obvious action would be to slow down and pull in to the side of the road to let it pass safely but every time you have passed a truck no harm has come of it. So you are able to ‘deny’ the risk, despite its blatancy, and drive on as normal with a metre or less between you and death.

So here is the final thing and I wonder if there is a connection between the three? Modern movies, TV shows and most significantly computer games allow us to experience horrifically dangerous and deadly situations without suffering any (other than perhaps psychological) harm. Could that have led to a general conditioning of westernised humanity (including pilots) to be able to subconsciously ignore hazards and adopt risky behaviours on the assumption that we will come to no harm however bad things look? After all, passengers now routinely collect their baggage before evacuating an aircraft, despite the high risk of fire and explosion. I don’t know…

Tuesday, 25 October 2016


After every mission military pilots and their crew will hold a debrief to discuss what went well and what could have been done better. This is an opportunity for those involved in the task to recognise superior performance and to learn by addressing any deficiencies – makes sense, right?

However, post-flight debriefs in commercial aviation are quite rare, in spite of the obvious potential value. In many regulatory jurisdictions pilots and cabin crew are only considered to be ‘on duty’ until 30 minutes after ‘on-chocks’ time so there is a limited window of opportunity. Furthermore, the crew for the next scheduled flight will frequently be waiting to get on board to maximise preparation time during the short turnaround period. Perhaps most influential is the fact that it isn’t ‘the way we do things around here’ – it’s not part of the culture.

A pilot would think nothing of remarking on a colleague’s smooth landing in difficult crosswind conditions for example but there is unlikely to be much discussion about it if it went less well. The absence of a debrief effectively implies that the entire flight proceeded satisfactorily and in accordance with standard operating procedures: there were no errors, deviations, distractions and consequently no opportunities to learn. Any pilot knows that is never the case but without a professional conversation immediately afterwards, the implication becomes reinforced.

Pilots are well used to debriefs after training flights so why not after every flight?