Volume 34 Number 5
Flight of Failure
By Michael Desmond | May 2019
Two fatal crashes in five months. Not since the de Havilland Comet in 1954 (bit.ly/2D0z51d) has a new commercial jet airliner suffered this kind of calamity so early in its life. A pair of Boeing 737 MAX aircraft, a model introduced in 2017, crashed minutes after takeoff: Lion Air 610 (JT610) plunged into the Java Sea off Indonesia in October 2018, and Ethiopian Airlines 302 (ET302) crashed just outside Addis Ababa in March 2019. In all, 346 lives were lost.
I’ve written about airline accidents before (“Chain of Disaster,” msdn.com/magazine/mt573708), exploring the role that automation and instrumentation played in the 2009 crash of Air France 447. And in the recent 737 MAX accidents, some of the same issues are cropping up. Commercial aviation has become significantly—even remarkably—safer over the past 30 years, but when accidents do happen, it’s often at the intersection of automated systems and the pilots who command them.
In the case of the 737 MAX, a system introduced to ease the aircraft’s nose down at certain points of flight malfunctioned at the worst possible moment—just after takeoff. The Maneuvering Characteristics Augmentation System (MCAS) was designed to make the 737 MAX behave more like its predecessor, the widely deployed 737NG, by counteracting a tendency in the new jet to at times pitch up when under power with the nose high. The goal was to make the 737 MAX and NG so similar to fly that no simulator training was needed for NG pilots, potentially saving airlines millions of dollars.
And there’s the rub: MCAS had to be pretty much invisible. Adding instrumentation to the cockpit or publishing procedures specific to erroneous MCAS activation could mandate simulator training. So when an angle of attack (AoA) sensor failed on JT610 and tricked MCAS into thinking the plane’s nose was dangerously high soon after takeoff, it caught the crew unprepared. The pilots repeatedly countermanded the nose-down trim from MCAS, only for the system to redouble the down force moments later, ultimately resulting in a high-speed dive.
Five months later ET302 encountered the same phenomenon. Investigators reported that the crew responded appropriately, disabling the trim motors in the tail to keep MCAS from forcing the nose further down. But now the pilots had to wrestle with a physical trim wheel, using cables to move the stabilizers at the back of the plane. Unfortunately, at the high speed the jet was traveling, aerodynamic forces made this impossible. When they re-engaged the trim motors in a bid to raise the aircraft’s nose, MCAS almost immediately pushed the nose down and the crew lost control.
The investigation into the accidents is ongoing, but hard lessons are emerging. Among the issues being discussed in the MCAS implementation:
Data validation: The AoA sensor data in each accident was clearly anomalous, yet MCAS remained activated throughout each event. By contrast, automated systems like autopilot typically fall back to manual control when fed incoherent data.
Redundancy: MCAS activated based on input from just one of the two AoA sensors on the 737 MAX fuselage. Cross-checking input from both sensors would significantly reduce the risk of inadvertent activation.
Conditional code: MCAS activated in response solely to high AoA values. Adding logic to account for airspeed, altitude and pilot input might prevent unwanted activation—for instance, MCAS could be coded to disable if a pilot commands trim up after an MCAS input.
Mission scope: MCAS has been described as an enhancement to make the 737 MAX fly and feel to pilots like a 737NG. Yet the software repeatedly counteracted pilot commands and ultimately trimmed to maximum deflection.
Transparency: MCAS had no UI—no visual or aural indication that it had activated or encountered a failure state—and no published failure checklist for pilots. The assumption was pilots would interpret unwanted MCAS activation as a runaway trim motor and use that checklist to shut down the motor.
While the flaws in MCAS are likely resolvable, the larger question remains of how this system ended up in an FAA-approved aircraft design. I expect the lessons of the 737 MAX and MCAS will inform software and systems development for decades to come.
Michael Desmond is the Editor-in-Chief of MSDN Magazine.