
Flying Through the "Cascades"

One System Failure Can Trigger Other Systems to Fail

System failures can trigger further system failures depending on how something is engineered. This is known as a cascading failure, and it can lead to catastrophic results. Non-aviation examples include the Fukushima Daiichi and Chernobyl nuclear power plant disasters, where a few seemingly manageable failures set off others in a domino effect. Aviation examples include United Flight 232 and Qantas Flight 32. On both flights, an uncontained engine failure led to multiple follow-on failures that quickly threatened to overload the flight crew. Fortunately, each crew prioritized the multitude of issues and landed with far less loss of life than the damage could easily have caused. The number one priority for each crew was something your instructor has probably said until they are blue in the face: fly the plane. They followed this up by sorting through the multitude of issues and exercising some of the finest crew resource management (CRM) ever seen.

You have probably learned as a pilot that flying the plane isn't always enough. When things go wrong, you still must assess the situation and address how it will affect the outcome of your flight. Early in our training we realize that a single simple failure can quickly overwhelm us; just one amber annunciator can consume our attention and create task saturation. As we progress in our training, we learn to handle more issues. Many of us might think we are good at multitasking, but studies show that we simply aren't: when a task demands your motor skills (flying), your capacity for cognitive work drops significantly. This is why using your phone while driving is so dangerous. The skill we develop as pilots is how to assess and prioritize multiple issues.

Technology can also play a considerable role in assisting our prioritization. The basic concept behind master warnings and master cautions (red and amber) is to prioritize alerts for us. When you shut down an engine on the ground, you get several master warnings and cautions, and they are already ranked: master warnings (fuel pressure, oil pressure) take precedence over master cautions (low volts). Now imagine this happening in flight. Combine an aircraft that doesn't prioritize failures with a pilot who has poor systems knowledge, and they may find themselves focused on getting the alternator back online before addressing the engine failure itself. That seems comical as you sit back and relax while reading this, but when you're busy in the air with several issues to address at once, it's easy to develop that kind of tunnel vision.
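If you're curious what that kind of ranking looks like under the hood, here is a tiny, purely illustrative sketch in Python. It is not real avionics software; the alert names and levels are made up for the example, and an actual crew alerting system is far more involved. The idea is simply that every alert carries a priority, and red warnings always surface ahead of amber cautions.

```python
# Illustrative only -- not real avionics logic. Alert names and levels are
# invented to show the "warnings before cautions" idea from the paragraph above.
from dataclasses import dataclass, field
from enum import IntEnum


class Level(IntEnum):
    """Lower value = higher priority."""
    WARNING = 0   # red: needs immediate attention (e.g., OIL PRESSURE)
    CAUTION = 1   # amber: needs timely attention (e.g., LOW VOLTS)
    ADVISORY = 2  # everything else


@dataclass(order=True)
class Alert:
    level: Level
    message: str = field(compare=False)  # ordering is by level only


def prioritized(alerts):
    """Sort alerts so red warnings surface before amber cautions."""
    return sorted(alerts)


if __name__ == "__main__":
    stack = [
        Alert(Level.CAUTION, "LOW VOLTS"),
        Alert(Level.WARNING, "OIL PRESSURE"),
        Alert(Level.WARNING, "FUEL PRESSURE"),
    ]
    for alert in prioritized(stack):
        print(alert.level.name, "-", alert.message)
    # OIL PRESSURE and FUEL PRESSURE print before LOW VOLTS,
    # just like the engine-shutdown example above.
```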

More advanced aircraft even have monitoring systems that are smart enough to inhibit certain alerts depending on the phase of flight. For example, if the coffee/oven heat fails while you are barreling down the runway on takeoff, that isn't something you need to know at that moment. Instead, the system holds the alert until later, when the crew is less busy and can apologize for the cold coffee.
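Again, just to make the concept concrete (this is not how any real crew alerting system is coded, and the phases and alerts below are invented), a phase-of-flight inhibit is conceptually nothing more than a filter that holds back low-priority messages during busy phases:

```python
# Illustrative only -- invented phases and alerts, not a real system.
INHIBITED_DURING = {
    # During the takeoff roll, suppress anything that isn't flight-critical.
    "TAKEOFF_ROLL": {"GALLEY OVEN FAIL", "COFFEE MAKER FAIL"},
    # In cruise, nothing is held back.
    "CRUISE": set(),
}


def active_alerts(raw_alerts, phase):
    """Show only the alerts appropriate for this phase of flight;
    the rest wait for a less busy moment."""
    inhibited = INHIBITED_DURING.get(phase, set())
    return [alert for alert in raw_alerts if alert not in inhibited]


if __name__ == "__main__":
    alerts = ["COFFEE MAKER FAIL", "ENGINE FIRE"]
    print(active_alerts(alerts, "TAKEOFF_ROLL"))  # ['ENGINE FIRE'] only
    print(active_alerts(alerts, "CRUISE"))        # both alerts are shown
```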

Despite such technologies that assist the flight crew, there are still events in which the crew becomes task saturated by a cascade of alerts and must decide which ones take priority. These scenarios are routinely run in simulators to evaluate a crew's ability to manage high-stress situations and work together toward a successful outcome. They also test the crew's knowledge of the aircraft's systems and how those systems are intertwined.

The moral of this story? Expanding your knowledge of aircraft systems may pay off someday. We hope it never comes to a serious emergency, but the more you learn and the better you understand an aircraft's systems, the better prepared you will be. This raises the question: how much knowledge is enough? The answer: you can always learn more. That isn't to say you should be able to build your aircraft from a pile of parts and make it fly; you certainly can't do that in flight. Delving further into systems, however, promotes a level of thinking and understanding that can only help you, never hurt you, in the future. Learning systems can be overwhelming and it takes time. You can start by asking yourself, "What happens when I flip this switch?" Here's a hint: the answer should never be, "Because the checklist says so."

 
