Sunday, June 1, 2008
Do You Have An Accident Personality?
Making sense of accident risks
Not even in military aviation, where discipline is fundamental, can every square human be made to fit into every regulatory round hole. In a five-year period ending in 2007, the Air Force found that “lack of discipline” was a factor in 23 Class A aircraft mishaps. Class A mishaps involve loss of life, injury resulting in permanent disability, destruction of an Air Force aircraft and/or property damage exceeding $1 million.
The FAA doesn’t require a psychological workup of applicants for pilot certificates. The closest it comes is encouraging its aviation medical examiners to engage applicants in conversation to see if there’s anything unusual going on in their lives. The agency does consider the abuse of illegal drugs and alcohol to be a warning sign of behavioral and personality issues and requires that each applicant consent to a search of the National Driver Register for DWI and/or other convictions.
Unsafe acts can be loosely classified into two categories: errors and violations.
The NTSB recently completed a statistical analysis of GA accidents occurring in 2003. It found that of the 1,635 accident pilots for whom data were available, 46% had 1,000 or fewer flight hours and 16% had 200 or fewer. At the other end of the spectrum, 27% had more than 3,000 hours and 16% had more than 8,000 hours. About 41% of the accident pilots had 100 or fewer hours in the aircraft make and model, while about 14% had more than 1,000 hours in make and model.
It’s easy to understand why accident risk would be relatively high when you don’t have much experience, especially in a particular aircraft. More difficult to understand is why accident risk also spikes at the high end of the hourly spectrum. The implication is that pilots are as much an accident risk when they’ve amassed 8,000 hours as before they crossed the 200-hour mark. It may be that some high-time pilots believe they’ve reached a point where it’s not necessary to keep learning. They may also believe that their experience and superior skills will more than compensate for any careless acts or muddled thinking. Using the same reasoning, they may believe that there’s no need to find out why other pilots crash because they’re too experienced to make the same mistakes.
Reports to NASA’s Aviation Safety Reporting System (ASRS) occasionally give us a glimpse at pilots who substitute experience for good judgment. For example, the first officer of a Gulfstream IV reported, “Thunderstorms in the area…lost the #1 generator. I started the auxiliary power unit so we could get the backup generator. The captain was upset with me for starting the APU without [getting his] permission. The captain was in a very upset mood…yelling at everything…about five minutes from airport, in total IMC, the captain got on the radio and cancelled IFR…when at midpoint on the runway we got hit with rain from a thunderstorm.” The first officer for an airline told ASRS about a captain who wasn’t adhering to some approach profiles and was flying too low on approach without having the gear and flaps set for landing. The captain was 25 years older than the first officer, and the first officer “kicked himself” for not speaking up.
One of the deadliest categories of GA accidents is VFR flight into IMC. Britain’s Civil Aviation Authority identified the underlying human factors in weather-related accidents as a pilot’s reluctance to admit to having limited capability and a lack of appreciation for the real dangers. University of Illinois researchers Juliana Goh and Douglas Wiegmann found that it’s possible to predict whether a pilot is likely to continue flying into deteriorating weather if you know enough about the pilot’s willingness to take risks, knowledge of pilot errors and weather hazards, and ability to self-evaluate skills and judgment.
James Reason, psychology professor at England’s University of Manchester, has studied human error in aviation and other industries. He finds that you need to identify and correct both a person’s unsafe acts and the elements of the system that may encourage such acts. Unsafe acts, when looked at in the context of the person, are due to problematic mental processes, such as recklessness, negligence, carelessness, inattention and poor motivation. But Reason is concerned that too much focus on the person makes it difficult to find systemic problems that may set the stage for recurring errors. A systemic focus may be fine for airlines, where pilots work within a complex organization, but, for the most part, GA pilots fly unsupervised and must be capable of self-awareness and self-evaluation.