Jan 18, 2020 | Posted by Randy Mains
When I teach and facilitate my one-day crew resource management (CRM) course or my five-day CRM train-the-trainer course, attendees often find it difficult to bridge the gap between recognizing the human factors they learn about in the course and actually putting that knowledge into practice in the real world to prevent an accident.
“See it, Say it, Fix it” is the most succinct and easiest-to-understand definition of CRM that I have heard in the decades I have been a CRM instructor and facilitator.
While you read this article, I challenge you to practice your CRM skills. Consider the following excerpts, red flags I’ve extracted from the NTSB’s recent 52-page report CEN19FA072, entitled “Group Chairmen’s Factual Report Operational Factors/Human Performance,” taken from their investigation of the fatal Survival Flight crash that killed three air medical crewmembers in January 2019 in Ohio. (I wrote an article entitled “Just Say No!” detailing this accident in the March/April 2019 issue of this magazine.)
Try to explain why the following are red flags. Why are they dangerous human factors that could become a link in an error chain (or a hole in James Reason’s “Swiss cheese” model) leading to that accident? These threats can be either active or latent in the “Swiss cheese” model. Your awareness, and the awareness of everyone in your organization, of which human factors can hurt you is in essence what CRM is all about. Once a red flag is identified, that awareness should be acted on (fixed) to break that link in the error chain (or plug that “Swiss cheese” hole) before it leads to an accident. After you successfully identify why those red flags are a danger, ask this important question: “Does this situation or culture exist in my organization?” If it does, it needs to be fixed right this minute.
Now, put on your “CRM antenna” (that is, be vigilant for an error chain forming) and act as an aviation forensic scientist as I highlight events taken from the NTSB report of that fatal accident.
Helicopter shopping is the potentially deadly practice of calling another flight program to see if it will accept a flight that a different program has turned down. It was considered a contributing factor in the Ohio crash because two other flight programs had turned down the request due to bad weather. The NTSB uncovered a derivative of helicopter shopping, something called reverse helicopter shopping.
On page 16, under the heading “Reverse Helicopter Shopping,” one pilot stated:
They (the Operational Control Center) specifically told me, they were looking at weather turndown and there's one that was turned down out of Pittsfield, Illinois. “We were going to call that hospital and see if you wanted to take it.”
Why is this a red flag?
On page 39, in section 4.17 under the heading “Safety Culture,” it’s noted:
Several former employees had stated that they received multiple texts from current company pilots and med crew stating they were “scared to fly.” One nurse stated that she believed the pilots were safe but the company (administration and management) were unsafe. Several pilots highlighted a lack of transparency by the company on safety issues.
Why is this a red flag?
Section 4.17.1 “Pressure to Attempt Flights” reads:
A pilot that had relocated to open the Columbus bases said there was “an awful push to get numbers … it was like they created an environment that felt like a competition, especially when [base] 14 opened up.” He stated that the vice president (VP) of emergency medical service (EMS) stated their flight volume was going to be 150 flights a month, where this pilot considered 30-35 flights per month to be realistically achievable in the new environment. Company management motivated bases to conduct flights by purchasing a massage chair for the base if they flew 30 flights in one calendar month. The count of flights per month was kept on the safety board in the SF14 base. According to the company’s monthly summary, the accident flight was the 26th flight the base would have completed in January.
Why is this a red flag?
Section 4.17.1.3 “Expected Launch Time” reads:
Pilots and medical crew stated that the company management wanted pilots to be off the pad within 7 minutes of getting a call for a flight. If the aircraft was not off the ground in 7 minutes, pilots were expected to fill out an “occurrence log” to explain to the DO [director of operations] why they didn’t lift off within 7 minutes.
Why is this a red flag?
Section 4.17.1.4 “Pressure from management to accept flights” reads:
Numerous pilots and medical crew indicated incidents where they were the recipient of or witnessed a pilot being reprimanded or challenged for declining a flight. One medical crewmember said, “The chief pilot of the company… would call within about 10 minutes and would cuss out our pilots and belittle them, … saying, … we need to take these flights…. he would yell so loud on the phone that you could hear it, … just standing within earshot.” He continued to say that the chief pilot told the pilot that if the base failed, it would be his fault because he was turning down flights.
Why is this a red flag?
In another case, a pilot declined a flight because of instrument conditions:
The lead pilot confronted him about why the pilot declined and said that the reporting station that was indicating IFR was faulty and that the pilot should have attempted the flight.
Why is this a red flag?
Section 4.17.2 “Bases not allowed to go out of service” reads:
Several current and former pilots relayed concerns that they were never able to issue a “red” risk assessment and take the base out of service for any reason, including maintenance, fatigue, or weather.
One pilot stated that if that happened the “owner would be calling blowing up our phones, hey, …why are you guys still out of service, why are you still out of service, where is the mechanic, is the mechanic working on it, is he done with that inspection yet or -- put the cowlings back on. Put it back together. We got to get back -- this is the culture of like ‘hurry, hurry, hurry, we cannot be out of service for anything.’”
Why is this a red flag?
I have listed only a few of the incidents from this report as a way to highlight the role human factors play in the formation of an accident. I cannot think of anything more tragic (and frustrating) than holes in James Reason’s “Swiss cheese” model being identified, yet no action taken to address them. Rachel Cunningham, the flight nurse who lost her life in that crash, voiced her concerns in a six-page email to her management 13 months before the accident, entitled “Accident Nurse Letter to Human Resources” (listed by the NTSB in their report as Attachment 4).
I am heartened, however, to see the NTSB looking into the human factors relating to this accident, because examining the human factors of a helicopter accident isn’t normally done. The NTSB should examine the human factors in every helicopter air ambulance (HAA) accident, as is done in every airline accident, because the data accumulated by Dr. Ira Blumen in his Opportunities for Safety Improvement study showed that 94% of HAA accidents have an element of human error. For that reason, it came as no surprise to me that the Survival Flight crash in January was no exception.
Randy Mains is an author, public speaker, and a CRM/AMRM consultant who works in the helicopter industry after a long career of aviation adventure. He currently serves as chief CRM/AMRM instructor for Oregon Aero. He may be contacted at [email protected].