Are you a pilot who turns down the radio’s volume and does a straight-in at an uncontrolled airport when there are four other aircraft neatly spaced in the traffic pattern? Do you think your lungs are so good that you can cruise at 15,500 feet MSL without supplemental oxygen? Are you convinced that you’re experienced enough to avoid using checklists? If so, you may be displaying some of the characteristics that aviation psychology researchers suggest can increase the chances of an accident.
Ever since the early days of aviation, when the Civil Aeronautics Board (forerunner of the FAA) put in place the first rules and regulations to bring order to the barnstormers, federal regulators have been trying to identify and isolate those pilots who are destined to jeopardize public safety by causing accidents.
The Federal Aviation Regulations have evolved into a complex web within which all pilots are expected to function. In theory, if everyone does everything exactly as written and interpreted by the bureaucrats and FAA Administrator, public safety is assured. But this is the real world. The FAA recognizes that pilots, like all humans, have different personalities and behaviors, and different levels of intelligence, attitude and skill. Research by the FAA’s Civil Aeromedical Institute and such institutions as Ohio State University and the University of Illinois is promoting an understanding of human factors in aviation, how those factors affect learning and decision making, and how they may be precursors to accidents.
Ohio State University has developed a risk-management tool for GA pilots that’s designed to help them evaluate their ability levels and develop personal guidelines. The idea is to preview each flight using a set of standards, which covers such areas as personal minimum cloud ceilings and minimum number of sleep hours the night before. Richard Jensen, Director of the Aviation Psychology Lab at Ohio State, notes that pilots in New York City or Los Angeles would need to develop personal criteria related to traffic considerations, while pilots in mountain areas would need to focus on terrain and wind-pattern issues. He also notes, “You can’t be a safe pilot unless you’re continuing to learn.”
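The general idea of such a preflight self-check can be sketched in a few lines of code. The categories and numbers below are hypothetical illustrations, not Ohio State's actual tool; each pilot would set limits appropriate to his or her own experience, aircraft and local conditions.

```python
# Illustrative personal-minimums preflight check. The categories and
# limits are assumed examples, not the Ohio State risk-management tool;
# a real pilot would tailor them to personal experience and region.

PERSONAL_MINIMUMS = {
    "ceiling_ft": 3000,      # lowest acceptable cloud ceiling, feet AGL
    "visibility_sm": 5,      # lowest acceptable visibility, statute miles
    "sleep_hours": 6,        # minimum sleep the night before the flight
    "crosswind_kt": 10,      # maximum acceptable crosswind component, knots
}

def preflight_check(conditions):
    """Compare planned-flight conditions against personal minimums.

    Returns a list of busted minimums; an empty list means the flight
    passes this pilot's self-imposed standards.
    """
    busts = []
    if conditions["ceiling_ft"] < PERSONAL_MINIMUMS["ceiling_ft"]:
        busts.append("ceiling below personal minimum")
    if conditions["visibility_sm"] < PERSONAL_MINIMUMS["visibility_sm"]:
        busts.append("visibility below personal minimum")
    if conditions["sleep_hours"] < PERSONAL_MINIMUMS["sleep_hours"]:
        busts.append("insufficient sleep")
    if conditions["crosswind_kt"] > PERSONAL_MINIMUMS["crosswind_kt"]:
        busts.append("crosswind exceeds personal maximum")
    return busts
```

The point of the exercise isn't the code, of course; it's that the limits are written down before the flight, when judgment is clear, rather than negotiated with oneself at the hold-short line.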
Scott A. Shappell, of the FAA, and Douglas A. Wiegmann, of the University of Illinois at Urbana-Champaign’s Institute of Aviation, proposed that unsafe acts by pilots can be loosely classified into two categories: errors and violations.
Errors are the mental or physical activities that fail to achieve the desired outcome. There are three basic types of errors: skill-based, decision-based and perceptual. Skill-based errors occur in the execution of basic flying tasks, and pilots vary widely here: two pilots may have the same training and the same number of flight hours, yet pilot #1 has trouble turning to headings and capturing altitudes, while pilot #2 can make any airplane soar like an eagle. Decision-based errors result from intentional behavior: the pilot's actions go as planned, but the plan itself is inadequate. Sometimes they’re excused as “honest mistakes,” while at other times they reflect a lack of basic knowledge. These errors can involve using wrong procedures, making a poor choice from known options, or incorrectly gathering information to evaluate a situation that’s new to the pilot. Perceptual errors are made because observations have been affected by illusions, spatial disorientation or preconceptions.
Violations constitute willful disregard for the rules, regulations and procedures that govern the safety of flight. Many in aviation believe that the FAA regulations are so complex and subject to interpretation that you can’t conduct any flight without violating some rule. An unsafe pilot reasons that because he routinely violates some obscure FAR anyway, or because he’s philosophically opposed to government regulation, he might as well ignore everything the FAA has promulgated. Flying while your medical certificate is in your wallet, which you left in the glove compartment of your car, is a violation, but not a safety issue. Flying in night/IFR conditions when you’re not instrument-rated is a violation and a very serious safety hazard. Even when there’s a safety issue, there’s an enormous difference between someone who deviates from a regulation or procedure because it’s expedient and someone who deviates because he or she resents authority.
Not even in military aviation, where discipline is fundamental, can every square human be made to fit into every regulatory round hole. In a five-year period ending in 2007, the Air Force found that “lack of discipline” was a factor in 23 Class A aircraft mishaps. Class A mishaps involve loss of life, injury resulting in permanent disability, destruction of an Air Force aircraft and/or property damage exceeding $1 million.
The FAA doesn’t require a psychological workup of applicants for pilot certificates. The closest it comes is encouraging its aviation medical examiners to engage applicants in conversation to see if there’s anything unusual going on in their lives. The agency does consider the abuse of illegal drugs and alcohol to be a warning sign of behavioral and personality issues and requires that each applicant consent to a search of the National Driver Register for DWI and/or other convictions.
The NTSB recently completed a statistical analysis of the GA accidents occurring in 2003. It found that of the 1,635 accident pilots for which data was available, 46% had 1,000 or fewer flight hours and 16% had 200 or fewer flight hours. Of the 1,635 accident pilots, 27% had more than 3,000 hours and 16% had more than 8,000 hours. About 41% of GA accident pilots had 100 or fewer hours in the aircraft make and model. About 14% of accident pilots had more than 1,000 hours in the make and model.
It’s easy to understand why accident risk would be relatively high when you don’t have much experience, especially in a particular aircraft. More difficult to understand is why accident risk also spikes at the high end of the hourly spectrum. The implication is that pilots are as much an accident risk when they’ve amassed 8,000 hours as before they crossed the 200-hour mark. It may be that some high-time pilots believe they’ve reached a point where it’s not necessary to keep learning. They may also believe that their experience and superior skills will more than compensate for any careless acts or muddled thinking. Using the same reasoning, they may believe that there’s no need to find out why other pilots crash because they’re too experienced to make the same mistakes.
Reports to NASA’s Aviation Safety Reporting System (ASRS) occasionally give us a glimpse at pilots who substitute experience for good judgment. For example, the first officer of a Gulfstream IV reported, “Thunderstorms in the area…lost the #1 generator. I started the auxiliary power unit so we could get the backup generator. The captain was upset with me for starting the APU without [getting his] permission. The captain was in a very upset mood…yelling at everything…about five minutes from airport, in total IMC, the captain got on the radio and cancelled IFR…when at midpoint on the runway we got hit with rain from a thunderstorm.” In another report, an airline first officer told ASRS about a captain who wasn’t adhering to some approach profiles and was flying too low on approach without having the gear and flaps set for landing. The captain was 25 years older than the first officer, who “kicked himself” for not speaking up.
One of the most deadly categories of GA accidents is VFR flight into IMC. Britain’s Civil Aviation Authority identified underlying human factors associated with weather-related accidents as a pilot’s reluctance to admit to having limited capability and a lack of appreciation for real dangers. University of Illinois researchers Juliana Goh and Douglas Wiegmann found that it’s possible to predict whether a pilot is likely to continue flying into deteriorating weather if you know enough about a person’s willingness to take risks, knowledge of pilot errors and weather hazards, and ability to self-evaluate skills and judgment.
James Reason, psychology professor at England’s University of Manchester, has studied human error in aviation and other industries. He finds that you need to identify and correct both a person’s unsafe acts and the elements of the system that may encourage such acts. Unsafe acts, when looked at in the context of the person, are due to problematic mental processes, such as recklessness, negligence, carelessness, inattention and poor motivation. But Reason is concerned that too much focus on the person makes it difficult to find systemic problems that may set the stage for recurring errors. A systems focus may be fine for airlines, where pilots work within a complex organization, but, for the most part, GA pilots fly unsupervised and must be capable of self-awareness and self-evaluation.
The NTSB tells the story of a low-time pilot who should have had a much better appreciation for his lack of experience and, therefore, capabilities. The private pilot had logged 91 hours total time with nine hours in a Cessna 182T. He took off from Sikeston Memorial Airport in Sikeston, Mo., on a night/VFR flight to the Spirit of St. Louis Airport in Chesterfield, Mo., with three passengers on board. The pilot radioed the tower that he was 11 miles south and was inbound. The flight was cleared to land on runway 26R. The wind was calm, visibility was five miles in mist and there were a few clouds at 7,500 feet. The airplane never touched down on the runway. Instead, the pilot radioed the tower to advise that he was going around. The controller cleared the pilot to make right traffic to runway 26R. The pilot acknowledged; it was his last transmission. The airplane struck trees, crashed and burned. All four occupants were killed.
The NTSB reported that the pilot didn’t maintain airspeed and allowed the airplane to stall during the go-around. The NTSB’s report on the accident included an e-mail written by the pilot’s instructor to the management of the flight school at which the pilot had been taking lessons. It said, in part, “He was very anxious to solo, a little too anxious in my opinion. I told him that I would solo him when I felt he was safe, and [he] seemed a bit frustrated that there may be a chance that I may delay his solo. This attitude worried me because I feel that a student pilot should trust his instructor’s judgment. Throughout the rest of the training, [he] began showing complacency in the airplane. I would stress to him the importance of using checklists, yet he wouldn’t use them unless I made him.” The instructor continued, “I’m worried about his complacent attitude toward flying and am expressing my concern for his safety [after he takes his] checkride.”
In another case, a private pilot who didn’t have an instrument rating or a sign-off to fly complex aircraft took off in a Piper PA-28R-200 Arrow on a night flight from Longmont, Colo., to Las Vegas, Nev., with three passengers. She had 144 hours with six in type. She filed a VFR flight plan with a cruise altitude of 15,500 feet MSL and 140 knots airspeed. In the flight plan, the pilot said the airplane had eight hours of fuel, and the flight would take four hours. In fact, the airplane could carry 50 gallons and had a normal burn rate of 12 gph. With a reserve of 45 minutes, usable fuel would last about 3 hours and 15 minutes.
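The arithmetic behind that endurance figure is worth spelling out, since the flight plan claimed more than twice the fuel the airplane held. The 48-gallon usable figure below is an assumption (the article gives only the 50-gallon capacity), chosen because it reproduces the stated 3-hour-15-minute result.

```python
# Fuel-endurance arithmetic for the accident flight. The 48-gallon
# usable figure is an assumption; the article states only a 50-gallon
# total capacity, a 12-gph burn rate and a 45-minute reserve.

USABLE_FUEL_GAL = 48     # assumed usable portion of the 50-gal capacity
BURN_RATE_GPH = 12       # normal cruise fuel burn, gallons per hour
RESERVE_HR = 0.75        # 45-minute reserve expressed in hours

endurance_hr = USABLE_FUEL_GAL / BURN_RATE_GPH   # 4.0 hours to dry tanks
flight_time_hr = endurance_hr - RESERVE_HR       # 3.25 hours with reserve intact

print(f"Usable flight time: {flight_time_hr:.2f} hours")
# 3.25 hours -- far short of the 8 hours claimed on the flight plan,
# and short of the 4-hour flight the pilot planned to make.
```

Run against the flight plan, the numbers fail on their face: a four-hour flight was planned in an airplane that could deliver barely more than three with reserves.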
The airplane crashed while the pilot was attempting a forced landing near La Sal, Utah, and all four occupants were killed. Radar data showed that the airplane had been cruising at altitudes of up to 16,000 feet MSL. The airplane wasn’t pressurized, and there was no supplemental oxygen on board. The pilot was in contact with ATC for flight following and made numerous navigational errors that controllers had to correct. Some of her headings were off by up to 70 degrees. The pilot’s flight instructor, who was an FAA safety counselor, told investigators that the pilot “always seemed to be in a hurry.” A friend who had previously flown with her in the accident airplane reported that she had trouble finding the master switch. The NTSB said the probable cause of the accident was fuel starvation, the pilot not following fuel-management procedures and lack of adequate preflight planning and preparation. Hypoxia was also a factor.
In both of these cases, the NTSB found evidence that others had observed behaviors of the pilots that were indicative of deficiencies in the way they approached piloting. Continued research may yield methods for helping pilots identify personal safety issues on their own so they can take corrective action that will prevent accidents.