Monday, 1 August 2016


The following is a 'guest post' from my Gates Aviation colleague John Edwards:

I have recently completed a risk assessment of the safety and security of an airline's crews, engineers and aircraft while operating in a high-risk environment during the period of a proposed wet lease. The 'donor' airline was registered in a state with a relatively high level of risk aversion, a culture the airline itself has embraced.

It is well known that the location where the personnel and aircraft would be based has a long history of terrorist attacks against commercial (and military) aviation, and that numerous fatalities have resulted.

The state authorities advise that "all foreigners may not move out of their city of residence without proper security and without prior coordination with the law enforcement agency". The approach taken to providing security in civilian life, especially in relation to what might be considered 'soft targets', e.g. selected public highways and shopping malls, was evident and likely to aid deterrence and detection of terrorists and planned attacks. There was meaningful evidence that efforts had been made to harden these previously soft targets and to protect the related communities.

Security had been tightened at the state's international airports following a number of terrorist attacks in 2014 and was further strengthened in 2015. A historic ban on locally registered airlines operating into the EU for safety reasons had been lifted in 2015. ICAO and the USA consider implementation of ICAO aviation safety standards in the state to exceed the global average. The airline with which the wet lease was proposed is a member of IATA and therefore, when last audited, met the requirements of IOSA. Viewed in combination, these facts provided evidence that the national aviation security and safety culture and infrastructure are widely considered to be sound.

The main roads between the airports and hotels feature multiple manned checkpoints, and the hotels where the crews and engineers were most likely to be staying had robust security measures in place. The better-quality shopping malls had entry search points operated by the military or other government agencies. It is known that places of worship and large public gatherings should be avoided, and that the periods leading up to political elections can see increased civil tension and unrest.

As in most business sectors and risk management environments, there was scope for continuous improvement to aviation security processes and procedures and for implementation of best practices, but I considered the existing measures to be relevant, adequate and sound. Accordingly, I concluded that, subject to these measures being maintained, to the threat level not being raised to 'red' (the highest level) and to good situational awareness being exercised by the donor airline's operational personnel while on site, the wet lease should not materially increase risk exposure. My assessment was that there were no substantive safety or security reasons why the proposed wet lease should not proceed as planned.

Wednesday, 27 July 2016


Human performance deficiencies were identified in the investigations of all 11 accidents and human factors overwhelmingly dominated the lists of causes and contributory factors. The study set out to analyze these deficiencies against two separate models, the Dupont ‘Dirty Dozen’ and the Pilot Competencies Model, as described at the outset. All of the accidents had identifiable factors from both models; indeed some accidents appeared to include deficiencies in virtually all of the Pilot Competencies together with many of the ‘Dozen’.

The graph below shows the number of accidents from the original 11, in which each of the markers from the Pilot Competencies Model was found to be deficient:

It is perhaps not surprising that the pilots in every accident exhibited deficient Situational Awareness – had they been more aware of the situation, it seems likely that they would have taken more appropriate action to avoid the terrain. It is disappointing that the pilots in all accidents were also deficient in the application of their procedures, but this may equally be encouraging, in that procedural compliance is thereby indicated as a significant factor in the mitigation of CFIT risk. Poor communication was identified in most (8) of the accidents, supporting the view that CRM (crew resource management) in general, and communication in particular, are vital for the avoidance of CFIT.

The next graph illustrates the same data for the Dupont ‘Dirty Dozen’ markers:

Once again, Lack of Awareness tops the table, appearing in all 11 accidents and reflecting the deficient Pilot Competency of Situational Awareness above. There is no Dupont marker that corresponds to the Pilot Competency of Application of Procedures, so no correlation can be seen there, but Lack of Communication again appears in the same 8 accidents. Lack of Teamwork and Lack of Assertiveness were identified in over half of the accidents, and these two markers are in some ways related: if the Captain in particular is not a ‘team player’ and fails to respect colleagues and their opinions, the FO may become isolated and feel unable to intervene, even to save their own life. This risk is exacerbated by a steep authority gradient in the cockpit. Norms featured as a marker in almost half (5) of the accidents, where pilots had developed and employed their own processes, either because they found the promulgated process inefficient, ineffective or difficult, or because there was no applicable process for them to employ.

Stress, Pressure, Fatigue, Complacency and Distraction each appeared in only 3 or fewer of the accidents but these conditions are sometimes difficult to identify from investigation reports. Unless the CVR records specific and attributable voice characteristics, or the individual mentions that they are affected by a condition, it may be that the condition goes unnoticed in the investigation.

It is worthy of note that neither model specifically addresses markers for Monitoring and/or Cross-checking, although Lack of Awareness and Situational Awareness respectively could be taken to include those competencies. Deficiencies in monitoring and cross-checking were apparent in several of the accidents. It is precisely these functions, functions we know humans perform quite poorly, that the EGPWS/GPWS/TAWS seeks to augment in CFIT prevention.

Finally, the graph below combines the two models to show the number of markers associated with each of the 11 accidents:

Accidents with a greater number of markers from one model also appear to have a similarly greater number from the other model – total markers per accident varied from 7 to 14 but the variation between models was never greater than 2. This may indicate that the two models identify similar human performance deficiencies but it may also be a reflection of the amount of information available from the accident reports upon which the study was based. These varied from a few pages to well over one hundred.
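The tallying method behind these graphs can be sketched in a few lines of code. Note that the accident names and marker sets below are invented purely for illustration; they are not the study's data.

```python
from collections import Counter

# Hypothetical marker sets - invented for illustration, not the study's data
accidents = {
    "Accident A": {"Situational Awareness", "Application of Procedures", "Communication"},
    "Accident B": {"Situational Awareness", "Application of Procedures"},
    "Accident C": {"Situational Awareness", "Communication", "Workload Management"},
}

# Number of accidents in which each marker was found deficient
# (the basis of the per-marker graphs)
marker_counts = Counter(m for markers in accidents.values() for m in markers)

# Total markers per accident (the basis of the combined graph)
totals = {name: len(markers) for name, markers in accidents.items()}

print(marker_counts["Situational Awareness"])  # deficient in every accident here
print(totals["Accident A"])
```

The same counting would be run once per model, then the two totals compared accident by accident.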

Friday, 8 July 2016


I saw this excerpt on Twitter today, from a book written by a test pilot:

It reminded me of a different book written by the Australian Group Captain Doug Edwards many years ago. He wanted to explore the reasons why high-performing airmen like military display pilots sometimes died in accidents they could have escaped from. Why didn't they eject?

His conclusion was a product of Abraham Maslow's hierarchy of human needs (read about it on Wikipedia). This suggests that humans may fear loss of status and self-esteem just as much as they fear death. High performers can become 'addicted' to their status and will do just about anything to preserve it, struggling to rescue a hopeless situation beyond the point from which they can escape.

Maslow's Hierarchy of Needs

Thursday, 7 July 2016


Almost 30 years ago I was in the right hand seat of a BAC 1-11 narrow-body twin, on approach to Aberdeen. It was around 10 at night, dark, windy and raining - pretty standard stuff. We were on the last of 3 rotations to Heathrow and quite keen to get back to the hotel for a beer and some rest. And the captain had brought his wife along for the week's detachment, so she was waiting for him...

He was flying down the non-precision approach, with the landing lights glaring onto the clouds in front of the windscreen and the anti-collision beacon intermittently looming orange. At around 500 feet from touchdown I called 'minimum', as dictated by the SOP, and there was certainly nothing like a runway in sight. The captain replied - I forget what he said - but we continued descending as before, and about a hundred feet later the runway lights appeared through the rain and we touched down uneventfully.

It was only later that it dawned on me what had happened - the captain had pressed on below minimum because he didn't want to divert. I was new to civil airline flying and new to the airline so wondered if that was how things were done around here. Of course I quickly learnt that it wasn't.

This all came back to me last week when I read about an A330 that landed off the side of the runway in Kathmandu. Apparently the pilots had trusted the accuracy of the aircraft's GPS-based navigation systems enough to continue below the published approach minimum. Of course, they were wrong.

To read about the outcome on Skybrary click HERE

Monday, 20 June 2016


I have spent a lot of time unpicking aircraft accidents in the course of my consultancy for IATA’s risk reduction programmes and as an expert witness in liability cases. Whilst that work risks desensitising me to the frequently unnecessary tragedy of these events, it has given me a keen insight into what causes pilots to crash their aeroplanes. Perhaps more importantly, I believe it has helped me to distil some key factors which could have stopped them happening – three in fact, which I will explain below.

Prevention – generally we would all accept that prevention is by far the best means to avoid accidents, in the air or on the ground. We have countless opportunities to prevent accidents every day and in flight operations this activity is formalised into procedures and checklists. If we adhere to these tried and tested action sequences, the overwhelming majority of flights will be uneventful. Therefore, straightforward procedural compliance can deliver accident prevention virtually every time.

Recognition – in today’s highly reliable aircraft, operating in a well-controlled environment facilitated by real-time weather, traffic and airspace information, it is rare for anything out of the ordinary to penetrate the serene world of the commercial pilot. But if something unusual does happen, it is vital that the pilots quickly recognise the deviation, picking it out from the backdrop of countless hours of ‘normal’. This ability to recognise the abnormal must be founded upon a comprehensive knowledge of what normal should look like: what is the acceptable range of values for every critical parameter.

Recovery – having recognised that things are not going to plan, pilots must be able to recover to normal, or at least to a new ‘normal’ within the constraints of whatever has occurred. Then and pretty much only then, do the pilots require real skill.

So that’s it: prevention through rigorous compliance, recognition based on comprehensive knowledge and, finally, recovery requiring piloting skill. Most of the current generation of airline pilots will probably never need more than the first of these (and that’s worth bearing in mind when hiring and training pilots), but how do we deliver and maintain the knowledge, and how do we hone the skills, when they may never be needed in the course of an entire career?

Tuesday, 24 May 2016


Many countries are enjoying a remarkable expansion in commercial air traffic as growing populations, increasing wealth and the loosening of sanctions make air travel more attractive and accessible. This growth is being fed by an ever increasing supply of the very latest aircraft, principally from the two big manufacturers, Airbus and Boeing – aircraft equipped with the most modern systems for navigation, communication and engine management.

The manufacturers will argue that all this slick technology offers operators substantial savings through reduced fuel burn, better aircraft utilisation and simpler, quicker flight crew transitions onto type. They might also argue that some of the new systems make the operation safer by adding layers of defence against mid-air collisions, flight into terrain or unstable approaches. Both of these arguments are correct, but they ignore the fact that in some cases new technology – or, more accurately, the way that we humans interact with the technology – has been a factor in the evolution of undesirable conditions and even accidents.

To illustrate this point it is worth going back a few years to the previous generation of jet aircraft. If a system was not performing as expected it was normal practice for the pilot (or more likely the flight engineer) to switch it off and back on in an attempt to recover normal service. Alternatively they might have pulled and reset the relevant circuit breaker, of which there were multiple panels in the cockpit. However, that is no longer necessarily the case. In a recent accident one pilot elected to resolve an intermittent flight control system fault by resetting a circuit breaker in flight, just as he had previously seen an engineer do on the ground. Unfortunately, that particular system takes some time to recover after a reset and in the meantime the flight characteristics of the aircraft changed to the extent that the other pilot lost control and the flight ended in the sea.

In another case, one pilot made a simple numerical error when commencing the take-off performance calculations and used a take-off weight value precisely 100 tonnes less than the actual take-off weight. Partly due to the process by which data was transcribed from the paper loadsheet into a laptop electronic flight bag, and then back from the laptop to a paper flight log and finally from there into the flight management system, this error was never identified, even though there were four pilots in the cockpit at the time. There were probably many other factors at work but the multiple interactions with technology apparently ‘blinded’ the pilots to a substantial and rather obvious discrepancy. The aircraft did eventually get airborne but sustained major damage after scraping its tail along the runway and through the overrun area before lifting off.
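A simple automated gross-error check at the final data-entry stage could catch a discrepancy of this size. The function below is a hypothetical sketch, not any airline's or manufacturer's actual procedure; the weights and the tolerance are illustrative only.

```python
# Hypothetical gross-error check on the take-off weight keyed into the FMS.
# The 1,000 kg tolerance and the example weights are illustrative only.
def weights_consistent(loadsheet_tow_kg: float, fms_tow_kg: float,
                       tolerance_kg: float = 1_000) -> bool:
    """Return True if the FMS entry agrees with the loadsheet within tolerance."""
    return abs(loadsheet_tow_kg - fms_tow_kg) <= tolerance_kg

print(weights_consistent(362_900, 262_900))  # a 100-tonne slip is flagged (False)
print(weights_consistent(362_900, 362_500))  # within tolerance (True)
```

The point is not the arithmetic but the independence of the check: a comparison made automatically, outside the chain of manual transcriptions, is immune to the 'blinding' effect described above.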

So why is it that technology designed to improve efficiency and accuracy can sometimes have the opposite effect? Take-off performance calculations are far more precise when performed by computer software than by the old-fashioned tables and graphs. Electronic flight control systems undoubtedly allow for smoother, simpler flying when they are working as designed. So it is not the technology that is at fault, nor indeed the concepts that support it. Although in the first case described above there was a technical fault with the flight control system – a cracked solder joint, in fact – if the pilots had managed the deficiency in accordance with the abnormal procedures they wouldn’t have fixed the fault, but they probably would have completed the flight.

Part of the problem is that we humans evolved to chase animals and pick berries, not to operate things with buttons and levers – not yet, anyway. Consequently our behaviours and thought processes are based on conditions that are no longer very relevant to the environment in which we work. Unlike just a few years ago, most of a pilot’s job today revolves around monitoring and managing systems whose actual functionality he or she would probably struggle to understand. It is sufficient in most cases to know what the input options are and what the outputs can be expected to look like; what goes on in between is of little operational significance. This is the fundamental principle behind claims that pilot training has been simplified (which can be interpreted as quicker and therefore cheaper) by modern aircraft systems.

That is true to a great extent, but it brings with it some unwanted baggage. Firstly, this lack of understanding of what happens between control selections and the eventual outcomes can become a liability when systems do not function correctly or when conditions are encountered that fall outside the design specification of the system. This was starkly illustrated in a high-altitude stall event a few years ago, when the flight dynamics were so far removed from what was deemed ‘normal’ that a critical stall warning system intermittently suppressed itself, in accordance with its design. The loss of the warning contributed to the confusion amongst the pilots, who could not resolve a number of apparently conflicting pieces of information. If they had fully understood the warning logic, they might have had a better chance. The investigation also determined that the pilots had not received training in high-altitude stalls, presumably on the basis that the systems would protect them and they would never have to manage such a situation.

Secondly, these modern systems are so reliable that deviations outside of ‘normal’ are incredibly rare. That should be regarded as a good thing, of course, but as a result – and apart from the controlled environment of a simulator – pilots might go many years without ever seeing something significantly ‘abnormal’. This can lead to an over-reliance on the aircraft systems and a reduced ability to recognise undesirable deviations before it is too late to recover. Hour after hour of safe and predictable flight may eventually be interrupted by a sudden and unexpected flight condition, and the industry is beginning to acknowledge the debilitating influence of the ‘startle’ effect on pilot performance in an emergency.

This is not a Luddite’s case against modern technology in aircraft; far from it. Terrain awareness systems alone have dramatically reduced the number of inadvertent collisions with the ground and bearing in mind the high fatality rates associated with those accidents, they have undoubtedly saved hundreds of lives. No, the point is that as this new technology sweeps through our commercial aircraft fleets, it radically changes the way that pilots must interact with their aircraft systems, how they think and how they behave. Manufacturers, regulators and operators must accept that fact and ensure that the training offered to pilots – initial, recurrent and upgrade – truly addresses the actual demands of the equipment they are to fly.

Jo Gillespie

Gates Aviation Limited

Friday, 13 May 2016


What can three words do for you? Well, these three words can address any location anywhere in the world – no postcode, no zip code, no street name, no house number. Which three words? That depends on the location, of course!

what3words recognised that millions upon millions of people and locations have no address, and so are denied access to many of the services the ‘well-addressed’ population can enjoy: banking, deliveries, emergency medical help and much, much more. Latitude/longitude references work for the navigationally minded among us, but the co-ordinates are complicated and error-prone.

This system divides the Earth’s surface into 57 trillion (yes, that many) 3-metre squares (we can discuss the spelling of metres later) and assigns each square a unique three-word address. Three words are simple to remember and make sense to anyone, and the system comes in several languages too. Once you download the free app onto your smartphone or tablet it will direct you to any 3-metre square you like: in the desert, in the jungle, on the ocean – you choose.
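The 57 trillion figure checks out with a quick back-of-envelope calculation, assuming the Earth's total surface area is roughly 510 million square kilometres:

```python
# Rough sanity check of the "57 trillion squares" claim
earth_surface_m2 = 510_000_000 * 1_000_000  # ~510 million km^2, converted to m^2
square_area_m2 = 3 * 3                      # each what3words square is 3 m x 3 m
squares = earth_surface_m2 / square_area_m2
print(f"{squares:.1e}")                     # about 5.7e13, i.e. ~57 trillion
```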

Take a look and download the app. From names.masterful.take (see if you can find me).