Blame the pilots?

An Air Force Times article from two days ago sums up the results of a six-week investigation of the October 3rd bombing of a Doctors Without Borders hospital in Afghanistan that killed 30 people – Crew Blamed.


However, the news story also notes deeper systemic issues: a culture that places pilots into impossible situations. But you can’t discipline a procedure or a rule or a culture. And no general is going to fall on his sword when there is a crew that can be blamed for ‘human error’.

 “The direct result of human error, compounded by systems and procedural failures.”

Army General John Campbell,
Top U.S. commander in Afghanistan.

The news story lists worrying items, including:

  • U.S. special operations commander “lacked the authority to direct the aircrew to engage the facility”.
  • The AC-130 aircraft that fired on the hospital had launched urgently in response to a report of troops under fire. As a result, the crew did not conduct a normal mission brief nor obtain “crucial mission essential related materials” that would include the “no-strike” list identifying the location of the hospital as being off limits.
  • During the flight, the aircraft’s onboard electronic systems malfunctioned, resulting in a breakdown of some essential command and control capability, including the loss of the aircraft’s ability to transmit video and to send and receive email.
  • In the air over Kunduz, the aircraft conducted an evasive maneuver that forced it to move “away from its normal orbit … this degraded the accuracy of certain targeting systems which later contributed to the misidentification of the MSF trauma center”.
  • When the targeting systems malfunctioned, the AC-130 aircrew “visually located the closest, largest building” and found that it “roughly matched” the physical description of the building that U.S. SOF commanders said was the proper target. “At night, the aircrew was unable to identify any signs of the hospital’s protected status”.
  • About one minute before the aircraft began firing on the hospital and despite the breakdown in some of the aircraft’s communications systems, the aircrew transmitted to their operational headquarters at Bagram Airfield the coordinates for the MSF trauma center as their target. The headquarters had access to the no-strike list, which included the hospital, but did not realize that the grid coordinates for the target matched a location on the no-strike list or that the aircrew was preparing to fire on the hospital.
  • The aircrew continued repeated strikes on the hospital target despite telling investigators that they did not observe hostile activity at the MSF trauma center.
  • During the 29-minute assault on the hospital, the aircraft’s targeting system began functioning correctly and identified the correct target, but “the crew remained fixated on the physical description of the facility” and disregarded the grid coordinates.
  • Twelve minutes after the assault on the hospital began, U.S. special operations forces received a call from MSF saying the hospital was under attack. But that information was not relayed to the aircrew before the AC-130 had completed its strike and departed the area.

Who reads all that, and thinks the first thing we should do is blame the gunship operators?

We’ve seen all this before. In 1994, two USAF F-15s shot down two US Army Black Hawk helicopters over northern Iraq in a friendly fire accident. The F-15 pilots were faulted for misidentifying the helicopters as hostile. It’s commonly referred to as the Black Hawk Incident (Wikipedia page). It was the basis of an outstanding book by Scott Snook titled Friendly Fire. The book, taking a lead from Perrow’s Normal Accidents paradigm, uses systems theory and organizational behavior to show how such accidents can, and will, happen. We can do things to prevent them from happening, but they are at the system level, not the blame-the-pilot level. Resilient organizations learn to absorb human errors and other disturbances, and still move forward.

At the pilot level, what can we do to lessen the chance of being caught in a Normal Accident, a system fault? Well, it’s tricky! But be prepared to push back against rushing. Be prepared to be known as a PITA (Pain In The Arse) who writes up conflicting organizational policies and procedures. Question authority. Learn your job, and the jobs of others you interact with, as fully as you can. And good luck! Because the organization will never fail to find a scapegoat: Blame the crew.

