Thursday, 12 January 2017


The first of these three things I have mentioned in this blog before. I spend a lot of time taking apart fatal aviation accidents, looking for the influences and factors which came together in the unique combination that allowed each ‘accident’ to happen. In recent years one of the most common precursors to a crash has been procedural non-compliance – deviation from standard operating procedures by one or more of those involved. The reasons that pilots and other professionals deviate in this way are many and often related to complex human behavioural conditions; it is worth looking at Abraham Maslow’s hierarchy of human needs for insight into some of them. Whatever the underlying reasons, prior to a crash the captain will frequently decide to do something contrary to their training and procedures that will eventually lead to their own demise. Sadly, their co-pilots often look on, aware that all is not well but saying nothing.

The second thing is ‘risk denial’; maybe I have raised it in earlier posts. This is a condition that arises when we are regularly exposed to a particular, perhaps severe, hazard but it never actually does us any harm. Over time we may subconsciously adopt a mind-set that whilst the severity could be very high, the probability or likelihood is so low that it can be disregarded. Imagine passing a heavy truck coming the opposite direction on a narrow lane – the obvious action would be to slow down and pull in to the side of the road to let it pass safely, but every time you have passed a truck no harm has come of it. So you are able to ‘deny’ the risk, despite its blatancy, and drive on as normal with a metre or less between you and death.

So here is the final thing and I wonder if there is a connection between the three? Modern movies, TV shows and most significantly computer games allow us to experience horrifically dangerous and deadly situations without suffering any (other than perhaps psychological) harm. Could that have led to a general conditioning of westernised humanity (including pilots) to be able to subconsciously ignore hazards and adopt risky behaviours on the assumption that we will come to no harm however bad things look? After all, passengers now routinely collect their baggage before evacuating an aircraft, despite the high risk of fire and explosion. I don’t know…

Tuesday, 25 October 2016


After every mission, military pilots and their crew hold a debrief to discuss what went well and what could have been done better. This is an opportunity for those involved in the task to recognise superior performance and to learn by addressing any deficiencies – makes sense, right?

However, post-flight debriefs in commercial aviation are quite rare, in spite of the obvious potential value. In many regulatory jurisdictions pilots and cabin crew are only considered to be ‘on duty’ until 30 minutes after ‘on-chocks’ time so there is a limited window of opportunity. Furthermore, the crew for the next scheduled flight will frequently be waiting to get on board to maximise preparation time during the short turnaround period. Perhaps most influential is the fact that it isn’t ‘the way we do things around here’ – it’s not part of the culture.

A pilot would think nothing of remarking on a colleague’s smooth landing in difficult crosswind conditions, for example, but there is unlikely to be much discussion if it went less well. The absence of a debrief effectively implies that the entire flight proceeded satisfactorily and in accordance with standard operating procedures: there were no errors, deviations or distractions, and consequently no opportunities to learn. Any pilot knows that is never the case, but without a professional conversation immediately afterwards the implication becomes reinforced.

Pilots are well used to debriefs after training flights so why not after every flight?


'Culture', a term we hear used a lot in aviation these days; organisational culture, reporting culture, safety culture, just culture... You name it, there's an applicable culture. But what does it mean?

If you look it up in a dictionary it will say something like 'common behaviours and beliefs shared by a group', and offer examples such as youth culture or drug culture. Alternatively you might hear it described as 'the way we do things around here'. These definitions imply that everyone in a group does certain things the same way, so how does that happen?

An example that comes to mind relates to the flashing of car headlights. Here in my native England, if someone else flashes their headlights at me I understand it to mean 'go ahead', and so does just about everyone else. However, I spent 17 years resident in Dubai, UAE, where flashing headlights means (emphatically) 'get out of my way!' One step further, if you are lucky enough to experience the joy of driving in Nairobi, Kenya, you will find a more complex flash-code. One long flash equates to 'don't go there', whereas two short flashes means 'please go ahead'.

Now, nowhere are these 'languages' written down, but you can see that failure to assimilate them quickly could result in an unpleasant outcome - we humans are fast learners when it comes to survival. That is the way they do things around here and so we will do it too. An important lesson in culture: we tend to emulate those around us, whether we think it is right or wrong.

As a postscript, it is worth considering the likelihood of misunderstandings as a result of modern automatic headlight systems.

Tuesday, 6 September 2016


So the preliminary findings of the investigation into the Boeing 777 that crashed on landing in Dubai in August are that the pilots touched down ‘long’ and elected to initiate a ‘baulked landing’ manoeuvre, presumably to reduce the risk of an overrun. A baulked landing is very similar to a go-around – press the TOGA (take-off/go-around) switches to automatically increase thrust (the 777 has two levels of thrust response depending on the number of presses), partly retract the flap and, once the climb is established, raise the gear. However, with wheels on the ground the TOGA switch thrust response is inhibited, so for a baulked landing the pilots must advance the throttles manually. A subtle but crucial difference which, if not practised regularly, may be overlooked in the heat of the moment.

Automation has all sorts of benefits in modern aircraft but due to the very wide range of operating environments and manoeuvres, it has different regimes of logic for different phases of flight. If these regimes are not fully understood, together with the conditions that bring about the transition from one to another, then the automation and hence the aircraft, may not respond as expected. This is not unique to Boeing and Airbus aircraft have suffered accidents for similar reasons. Today’s pilots must learn to ‘think like their aeroplane thinks’…

Monday, 5 September 2016


This is a bit of fun but it also has a serious side - give it a try and see how your organisation fares. We put it together to give you the opportunity for a bit of honest self-analysis of the culture with regard to safety and risk in your business. It doesn't purport to be a comprehensive analysis but it should give you an insight into how things are going.

If you come out with a score of 15-17, things are probably going pretty well, but less than 10 could indicate that you have some systemic cultural and/or organisational safety issues which need to be addressed. At Gates Aviation we have a collaborative and realistic approach to resolving these issues without turning the business on its head. Give Sean Gates a call on +44 (0)207 4696437 or e-mail .

Score 1 for ‘True’, 0.5 for ‘Part true’ and 0 for ‘False’
The organisation has a clear safety policy:
There is a policy statement with respect to safety and risk, that is written in simple and clear language, agreed by senior management and signed by the CEO/MD/Accountable Manager

The safety policy reflects reality:
The terms of the policy reflect the genuine intent of the organisation’s management with regard to the safety of people, property and the environment

The organisation has clear safety objectives:
There are a number of clearly stated and generally SMART safety objectives (2-6), which reflect the specific goals of the organisation with regard to safety and risk

The safety objectives directly support the safety policy:
There is a recognisable link between the goals stated in the safety objectives and the intent implicit in the safety policy

Safety activities and initiatives directly support the objectives:
The allocation of resources, the activities of the safety department and the safety initiatives of the organisation demonstrably support the objectives

The safety objectives are widely known and understood:
Most personnel, especially those in front line safety critical roles, can articulate at least the intent of the safety objectives

The safety objectives have meaningful performance indicators:
Each safety objective has one or more metric or performance indicator (SPI), which genuinely measures the organisation’s progress with respect to that objective

The performance indicators have valid targets:
The organisation has defined realistic and achievable targets for each SPI, and there is a process to review the targets regularly

Performance in relation to targets is regularly reviewed:
Senior management has a process to review safety performance as indicated by the SPIs and the achievement of targets

Failure to meet a performance target is examined at senior level:
Failure to meet a performance target in the allocated time is analysed by senior management and the reasons for failure identified and addressed

Safety performance data is shared throughout the organisation:
Safety performance as indicated by the SPIs and targets is disseminated to all personnel in an appropriate and understandable format

Reporting of safety incidents and accidents is a requirement:
All personnel have an explicit and contractual obligation to report safety incidents and accidents via an established safety reporting programme

Reporting of hazards and near-misses is encouraged:
Personnel understand what constitutes a hazard and a near-miss in safety terms and are positively encouraged to report them

Incidents, accidents, hazards and near-misses are investigated:
There is a documented process to ensure that reported safety issues receive an appropriate level of investigation by trained safety investigators

Reporters are treated fairly:
Originators of safety reports are treated in a fair and consistent manner, are assured of an appropriate degree of confidentiality, and always receive acknowledgement and feedback

Acceptable and unacceptable behaviours are clearly defined:
There are documented definitions of what constitutes acceptable and unacceptable behaviour with regard to safety and risk

Disciplinary processes are clear and consistent:
The consequences for an individual found to have behaved in an unacceptable manner with regard to safety and risk are clearly defined and always consistent

Total score:
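The scoring described above can be sketched as a few lines of code. This is a minimal illustration of the tally, not Gates Aviation's actual tool; the answers shown are hypothetical placeholders for the 17 statements.

```python
# Minimal sketch of the self-assessment scoring: 1 for 'True',
# 0.5 for 'Part true', 0 for 'False', summed across the statements.

POINTS = {"True": 1.0, "Part true": 0.5, "False": 0.0}

def score(answers):
    """Sum the points for a list of 'True'/'Part true'/'False' answers."""
    return sum(POINTS[a] for a in answers)

def interpret(total):
    """Map a total score to the rough reading given in the post."""
    if total >= 15:
        return "Things are probably going pretty well"
    if total < 10:
        return "Possible systemic cultural and/or organisational safety issues"
    return "Room for improvement"

# Hypothetical responses to the 17 statements
answers = ["True"] * 12 + ["Part true"] * 4 + ["False"]
total = score(answers)
print(total, "-", interpret(total))  # 14.0 - Room for improvement
```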

Monday, 1 August 2016


The following is a 'guest post' from my Gates Aviation colleague John Edwards:

I have recently completed a risk assessment of the safety and security of an airline's crews, engineers and aircraft while operating in a high risk environment during the period of a proposed wet lease. The 'donor' airline was registered in a state with a relatively high level of risk aversion, which the airline itself had embraced.

It is well known that there has been a longstanding history of terrorist attacks against commercial (and military) aviation in the location where the personnel and aircraft would be based, and that numerous fatalities have resulted.

The state authorities advise that "all foreigners, may not move out of their city of residence without proper security and without prior coordination with the law enforcement agency". The approach taken to providing security in civilian life, especially in relation to what might be considered 'soft targets', e.g. selected public highways and shopping malls, was evident and likely to aid deterrence and detection (of terrorists and planned attacks). There was meaningful evidence to suggest efforts had been taken to harden these (previously soft) targets and protect the related communities.

Security had been tightened at the state's international airports following a number of terrorist attacks in 2014, and further strengthened in 2015. A historic ban on locally registered airlines operating into the EU for safety reasons had been lifted in 2015. ICAO and the USA consider implementation of ICAO aviation safety standards in the state to exceed the global average. The airline with which the wet lease was proposed is a member of IATA and therefore, when last audited, met the requirements of IOSA. Viewed in combination, these facts provided evidence that the national aviation security and safety culture and infrastructure are widely considered to be sound.

The main roads between the airports and hotels feature multiple manned checkpoints, and the hotels where the crews and engineers were most likely to be staying had robust security measures in place. The better quality shopping malls had entry search points operated by the military or other government agencies. It is known that places of worship and large public gatherings should be avoided, and that periods leading up to political elections can see increased civil tension and unrest.

As in most business sectors and risk management environments, there was scope for continuous improvement to aviation security processes and procedures and implementation of best practices, but I considered the existing measures to be relevant, adequate and sound. Accordingly, I concluded that, subject to these measures being maintained, to the threat level not being raised to 'red' (the highest level) and to good situational awareness being exercised by operational personnel from the donor airline when they are on site, the wet lease should not materially increase risk exposure. My assessment was that there are no substantive safety or security reasons why the proposed wet lease should not proceed as planned.

Wednesday, 27 July 2016


Human performance deficiencies were identified in the investigations of all 11 accidents, and human factors overwhelmingly dominated the lists of causes and contributory factors. The study set out to analyse these deficiencies against two separate models, the Dupont ‘Dirty Dozen’ and the Pilot Competencies Model, as described at the outset. All of the accidents had identifiable factors from both models; indeed some accidents appeared to include deficiencies in virtually all of the Pilot Competencies together with many of the ‘Dozen’.

The graph below shows the number of accidents from the original 11, in which each of the markers from the Pilot Competencies Model was found to be deficient:

It is perhaps not surprising that the pilots in every accident exhibited deficient Situational Awareness – had they been more aware of the situation, it seems likely that they would have taken more appropriate action to avoid terrain proximity. It is disappointing to see that the pilots in all accidents were also deficient in the application of their procedures, but on the other hand this may be encouraging, in that procedural compliance is possibly indicated as a significant factor in the mitigation of CFIT risk. Poor communication was also identified in most (8) of the accidents, supporting a view that CRM (crew resource management) in general, and communication in particular, are vital for the avoidance of CFIT.

The next graph illustrates the same data for the Dupont ‘Dirty Dozen’ markers:

Once again Lack of Awareness tops the table, appearing in all of the 11 accidents and reflecting the deficient Pilot Competency of Situational Awareness above. There is no Dupont marker that corresponds to the Pilot Competency of Application of Procedures so we can’t see a correlation but Lack of Communication again appears in the same 8 accidents. Lack of Teamwork and Lack of Assertiveness were identified in over half of the accidents and these two markers are in some ways related. If the Captain in particular is not a ‘team player’ and fails to respect colleagues and their opinions, the FO may become isolated and feel unable to intervene, even to save their own life. This risk is exacerbated by a steep authority gradient in the cockpit. Norms featured as a marker in almost half (5) of the accidents, when pilots developed and employed their own processes, either when they found that the promulgated process was inefficient, ineffective or difficult, or when there was no applicable process for them to employ.

Stress, Pressure, Fatigue, Complacency and Distraction each appeared in only 3 or fewer of the accidents but these conditions are sometimes difficult to identify from investigation reports. Unless the CVR records specific and attributable voice characteristics, or the individual mentions that they are affected by a condition, it may be that the condition goes unnoticed in the investigation.

It is worthy of note that neither model specifically addresses markers for Monitoring and/or Cross-checking, although Lack of Awareness and Situational Awareness respectively could be taken to include those competencies. Deficiencies in monitoring and cross-checking were apparent in several of the accidents. It is precisely these functions, functions we know humans perform quite poorly, that the EGPWS/GPWS/TAWS seeks to augment in CFIT prevention.

Finally, the graph below combines the two models to show the number of markers associated with each of the 11 accidents:

Accidents with a greater number of markers from one model also appear to have a similarly greater number from the other model – total markers per accident varied from 7 to 14 but the variation between models was never greater than 2. This may indicate that the two models identify similar human performance deficiencies but it may also be a reflection of the amount of information available from the accident reports upon which the study was based. These varied from a few pages to well over one hundred.
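The tallies behind these graphs are straightforward to sketch: for each marker, count the accidents in which it was identified, and for each accident, sum the markers found across both models. The data below is invented purely for illustration; it is not the study's actual dataset, and the accident labels are hypothetical.

```python
# Illustrative tally of human-factors markers across accidents,
# using invented data (NOT the study's real findings).
from collections import Counter

# Hypothetical per-accident findings: accident -> set of deficient markers
competencies = {
    "A1": {"Situational Awareness", "Application of Procedures", "Communication"},
    "A2": {"Situational Awareness", "Application of Procedures"},
    "A3": {"Situational Awareness", "Application of Procedures", "Communication"},
}
dirty_dozen = {
    "A1": {"Lack of Awareness", "Lack of Communication", "Norms"},
    "A2": {"Lack of Awareness", "Lack of Teamwork"},
    "A3": {"Lack of Awareness", "Lack of Communication"},
}

def accidents_per_marker(findings):
    """How many accidents exhibited each marker (one graph's data)."""
    counts = Counter()
    for markers in findings.values():
        counts.update(markers)
    return counts

def markers_per_accident(*models):
    """Total markers per accident, combined across the models."""
    return {a: sum(len(m[a]) for m in models) for a in models[0]}

print(accidents_per_marker(competencies))
print(markers_per_accident(competencies, dirty_dozen))  # {'A1': 6, 'A2': 4, 'A3': 5}
```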