“We’ll get that straight when we get airborne”

It’s better to look a little stupid now, than a lot stupid later.

The US NTSB has released full investigative notes on last year’s crash of US Airways flight 1702, an A320 taking off from Philadelphia International Airport. We can learn a lot from this crash of a fully airworthy Airbus. It actually got into the air, and then the captain decided to force it back onto the runway. There was substantial damage. The jet departed the runway. I’m glad all 149 passengers and 5 crew exited the aircraft via the emergency exits with no serious injuries. But the jet didn’t look good:

[Photo: the damaged US Airways flight 1702]

Airline management suggested afterwards that maybe a tire had burst at high speed, and the crew saved the day. That was not the case. The tires, indeed the whole nose gear assembly, were broken by the unusual high-speed post-rotation ‘reject’. The reasons for the accident were, as usual, multiple. The end result was that the takeoff performance speeds were not set in the computer; the A320 didn’t know what V1, Vr or V2 were. This was caused by a complex series of non-optimal machine/human interfaces, computer software, human/human interaction, operational pressures, and possibly medical issues. (The captain reported for duty without allowing the FAA-recommended time to pass after taking two prescription medications: midazolam, a sedating drug, and fentanyl, a narcotic used as part of medical procedures.) There is a lot to study here, especially for Airbus pilots.

But the overriding lesson is simple, if jolly hard to do. It’s better to look a little stupid now—reject the takeoff at low speed, taxi back, enter the V speeds, try again—than it is to look a lot stupid later. Like maybe crash.

At the start of the takeoff roll, when the Airbus sounded its warning about the lack of V speeds, the audible alarm saying “Retard” [the thrust levers], the pilots decided to ignore it. From the cockpit voice recorder (CVR) transcript:

retard

retard

“What did you do? you didn’t load. we lost everything.”

retard

retard

retard

“No”

retard

retard

retard

“We’ll get that straight when we get airborne.”

retard

retard

“Wh*. I’m sorry.”

retard

retard

retard

Within the next fraction of a second, the CVR captured the ‘sound of decreased background noise, similar to power reduction’ and then the ‘sound of first impact’.

[Photo: flight 1702 off the end of the runway]

Ouch.

The pilots knew something was not right at the start of the takeoff. But it would look bad to admit error to ATC, the flight attendants, the passengers and the company. They might have to go back to the gate to get more fuel. They might have to fill out paperwork. They might have to talk to the chief pilot. Never mind that the errors are all wrapped up in system design, training, operational practices, and more. They would look bad. Stupid pilots. So they pressed on. Until the captain got so scared he forced the just-about-airborne jet back onto the end of the runway.

So while Airbus, the FAA, American Airlines training and many more busy suits work on fixing the software and procedures that led to the situation, we pilots can take away an important lesson: It’s better to look a little stupid now, than a lot stupid later.

But good luck always following through on my motto. In the real world, it’s easy to say it, but tough to do it.

Like really tough.

Peter Isler’s sailing secrets

Peter Isler won the America’s Cup twice, and has a long, amazing list of sailing achievements. He also wrote a book of sailing secrets. I just read it, looking for seamanship secrets we can use as pilots. And guess what, the secrets really aren’t that secret! There were some ideas that directly translate to flying. Here’s one that resonated:

Behind every great sailor is an awful lot of time spent practicing the basics back home—putting in the long hours in the cold of the spring and fall off-season—honing the skills they need to win on the racecourse.

The same applies for all of us trying to improve. There’s no fast track to winning. It takes practice. It takes preparation. It takes working out the routines. Over and over again. Doing your homework by taking the time to practice is a key ingredient in becoming an accomplished sailor.

Peter Isler

Later in the book he says:

When talking about an America’s Cup campaign, Dennis Conner pointed out, “Time is not your friend.” There is only so much you can do to prepare before the race begins. So whatever the amount of time you have carved out for your sailing practice, try to make time your friend by setting realistic goals and using whatever time you are on the boat wisely for full-on practice, from the very moment you leave the dock.

Peter Isler

There’s also a bunch of stuff on tying knots and sails.


Do you wear a watch?


Airplane owners always say “safety is number one,” but how many encourage us not to wear a watch?

Captain Richard de Crespigny is a former military pilot, and was PIC of QF32, the A380 that suffered massive damage after engine #2 exploded. The quote is from his fantastic book, QF32.

Blame the pilots?

An Air Force Times article from two days ago sums up the results of a six-week investigation of the October 3rd bombing of a Doctors Without Borders hospital in Afghanistan that killed 30 people – Crew Blamed.


However, the news story also notes deeper systemic issues, a culture that places pilots into impossible situations. But you can’t discipline a procedure or a rule or a culture. And no general is going to fall on his sword when there is a crew that can be blamed for its ‘human error’.

 “The direct result of human error, compounded by systems and procedural failures.”

Army General John Campbell,
Top U.S. commander in Afghanistan.

The news story lists worrying items including:

  • U.S. special operations commander “lacked the authority to direct the aircrew to engage the facility”.
  • The AC-130 aircraft that fired on the hospital had launched urgently in response to a report of troops under fire. As a result, the crew did not conduct a normal mission brief nor obtain “crucial mission essential related materials” that would include the “no-strike” list identifying the location of the hospital as being off limits.
  • During the flight, the aircraft’s onboard electronic systems malfunctioned, resulting in a breakdown of some essential command and control capability, including the loss of the aircraft’s ability to transmit video, and send and receive email.
  • In the air over Kunduz, the aircraft conducted an evasive maneuver that forced it to move “away from its normal orbit … this degraded the accuracy of certain targeting systems which later contributed to the misidentification of the MSF trauma center”.
  • When the targeting systems malfunctioned, the AC-130 aircrew “visually located the closest, largest building” and found that it “roughly matched” the physical description of the building that U.S. SOF commanders said was the proper target. “At night, the aircrew was unable to identify any signs of the hospital’s protected status”.
  • About one minute before the aircraft began firing on the hospital and despite the breakdown in some of the aircraft’s communications systems, the aircrew transmitted to their operational headquarters at Bagram Airfield the coordinates for the MSF trauma center as their target. The headquarters had access to the no-strike list, which included the hospital, but did not realize that the grid coordinates for the target matched a location on the no-strike list or that the aircrew was preparing to fire on the hospital.
  • The aircrew continued repeated strikes on the hospital target despite telling investigators that they did not observe hostile activity at the MSF trauma center.
  • During the 29-minute assault on the hospital, the aircraft’s targeting system began functioning correctly and identified the correct target, but “the crew remained fixated on the physical description of the facility” and disregarded the grid coordinates.
  • Twelve minutes after the assault on the hospital began, U.S. operations forces received a call from MSF saying the hospital was under attack. But that information was not relayed to the aircrew before the AC-130 had completed its strike and departed the area.

Who reads all that, and thinks the first thing we should do is blame the gunship operators?

We’ve seen all this before. In 1994 two USAF F-15s shot down two US Army Black Hawk helicopters over Northern Iraq in a friendly fire accident. The F-15 pilots were faulted for misidentifying the helicopters as hostile. It’s commonly referred to as the Black Hawk Incident (Wikipedia page). It was the basis of an outstanding book by Scott Snook titled Friendly Fire. The book, taking a lead from Perrow’s Normal Accidents paradigm, uses systems theory and organizational behavior to show how such accidents can, and will, happen. We can do things to prevent them from happening, but they are at the system level, not the blame-the-pilot level. Resilient organizations learn to absorb human errors and other disturbances, and still move forward.

At the pilot level, what can we do to lessen the chance of being caught in a Normal Accident, a system fault? Well, it’s tricky! But be prepared to push back against rushing. Be prepared to be known as a PITA (Pain In The Arse) who writes up conflicting organizational policies and procedures. Question authority. Learn your job, and the jobs of others you interact with, as fully as you can. And good luck! Because the organization will never fail to find a scapegoat: Blame the crew.


Standard checklists

Despite the emergency checklists provided for abnormalities, it’s the standard checklists that you use before you begin your flight that often determine whether you live or come crashing down in a pile of mistakes.

Erika Armstrong
from her new book ‘A Chick in the Cockpit.’


The book has some good flying stuff in it, but is more about her personal life journey. One of the most engaging books I’ve read this year.