The aviation industry sure loves its statistics—there's an X% chance of this, and one aircraft is Y times safer than Z. But what if you were told that just about everything you've heard about aviation accident statistics isn't true? Most pilots feel pretty good about the commonly published statistics claiming that all types of air travel are safer than driving. But if the numbers are presented in a certain way, general aviation flying can appear more dangerous than driving. Before you throw the magazine across the room and denounce such claims as ludicrous, let's take a look at the facts.
Before you blindly believe the studies and the plethora of numbers associated with them, there are some things you should know and think about. It should be no surprise that aircraft manufacturers want to paint their aircraft in the most favorable light possible; after all, they're trying to sell airplanes. And the media seems to chronically harp on the gloom and doom of flying. The point is, we have to be careful about the statistical games being played by people with different motivations. Statistics can be, and often are, manipulated to make things look better or worse than they really are. For example, one study claimed that kids who weighed more were smarter. Wow. Maybe you should let them eat more fast food, right? Well, no. The truth behind this finding is that kids who are older (and, thus, have more years of education) weigh more. You've got to get the whole picture—don't necessarily take things at face value. Such nonsense is probably what prompted British statesman Benjamin Disraeli to note, "There are three kinds of lies: lies, damned lies and statistics."
One big problem with comparing aviation accident statistics to driving accident statistics is that you're really comparing apples to oranges. Aviation accident rates are normally shown as an amount per number of flight hours. According to the 2005 Nall Report, there were 7.2 GA accidents and 1.39 GA fatalities per 100,000 flight hours. But auto accident rates are based on miles, not hours. The current rate for motor vehicles is 1.3 deaths per 100 million vehicle miles. What if GA accident stats were presented in miles instead? Using an average aircraft speed of 130 mph and the estimated 23.1 million flight hours reported by the FAA in 2005, you'd be left with 16.3 deaths per 100 million miles flown. Not quite as flattering, huh?
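The hours-to-miles conversion above can be sketched as a quick calculation. The 130 mph average speed and the 1.39-per-100,000-hour rate are the figures from the text; note that feeding the rounded rate straight through this formula yields a somewhat lower number than the article's 16.3, which presumably rests on the underlying fatality and mileage totals rather than the rounded rate.

```python
# Rough sketch: convert a fatality rate per 100,000 flight hours into a
# rate per 100 million miles, given an assumed average speed. The 130 mph
# speed and the 1.39 rate come from the text above; the exact published
# figure (16.3) was presumably derived from the raw totals instead.

def per_100k_hours_to_per_100m_miles(rate_per_100k_hours: float,
                                     avg_speed_mph: float) -> float:
    """Convert deaths per 100,000 flight hours to deaths per 100M miles."""
    deaths_per_hour = rate_per_100k_hours / 100_000
    deaths_per_mile = deaths_per_hour / avg_speed_mph
    return deaths_per_mile * 100_000_000

ga_rate = per_100k_hours_to_per_100m_miles(1.39, 130)
print(f"GA: ~{ga_rate:.1f} deaths per 100M miles vs. 1.3 for motor vehicles")
```

Either way the message is the same: once you express the aviation rate in the units used for driving, the comparison looks far less rosy.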
Am I claiming that flying is dangerous? Of course not. We can look at yet another statistic, the chance of dying in a GA accident in 2005, which was about one in 613,000 while your chance on the road was one in 7,700. Which brings up the next point. If statistics don't always tell the full story, as we've seen so far, how is a particular aircraft deemed "safe" or "unsafe"? Take, for example, the Cirrus. It's gotten a bad rap from various media outlets while the company claims that it's one of the safest airplanes in the sky. I've flown one, and it didn't seem to be ornery at all. In fact, it flew better than many other airplanes I've flown. So what gives?
Perhaps the apples-to-oranges problem crops up yet again when we go looking for statistical answers. While a discussion about airline fatality stats versus those in GA does come up, everyone realizes they're two different ball games. Yet, is it fair to compare accident rates of aircraft that are designed to "go places" with those that mostly stick to training or even those that crop dust? The point is, to truly determine how safe an airplane is, you've got to compare it to aircraft doing the same kind of flying. Most people don't want to purchase a 200-plus-horsepower, 150-knot aircraft to do pattern work. They'd be more likely to travel, take the family on vacation or go on business trips.
Needless to say, the chances of getting into icing conditions, flying from VMC into IMC or falling victim to controlled flight into terrain are lower in the pattern or practice area than while traveling cross-country. The point is, to really get a good idea about whether or not an aircraft is safe, it's not enough just to look at raw GA numbers: You must consider the types and amounts of exposures a particular aircraft experiences. Even the Nall Report admits, "Meaningful comparisons are based on equal exposure to risk. However, this alone does not determine total risk. Experience, proficiency, equipment and flight conditions all have a safety impact. To compare different airplanes, pilots, types of operations, etc., we must first 'level the playing field' in terms of exposure to risk."
There are several "traveling" aircraft—those primarily used to get places rather than around the pattern—that can be used as examples. The Cirrus (SR20 and 22), Bonanza B36, Columbia (300, 350 and 400) and Mooney M20R fit the bill. One way to compare aircraft is to look at how many accidents have occurred in relation to the number of aircraft built. The approximate accidents (from the NTSB database) per aircraft built are: Cirrus, 0.025; Bonanza, 0.087; Columbia, 0.005; Mooney, 0.037. While this data is interesting, it provides an incomplete picture.
What's missing? First, each aircraft type has flown a different amount of total hours. Of each hour flown, individual operators may be conducting their flights under different scenarios, such as for business as opposed to for pleasure. Are certain airplanes more likely to encounter IMC? The aforementioned stats also neglect the fact that some aircraft have been taking to the air for many years while some are relatively new. It's obvious that an older fleet is likely to cause pilots more problems than one coming off the factory line.
To make yet another point, how about we compare the "traveling" class with aircraft that don't make a habit of flying coast-to-coast? The Piper PA28R, Cessna 182 and Diamond DA40 seem to fit this model. The approximate accidents per aircraft built are: Piper, 0.220; Cessna 182 (all years), 0.250; Cessna 182 (1997–2006), 0.026; Diamond, 0.009. I think these stats make a pretty good case that it's critical to take such numbers with a grain of salt. The Piper and Cessna mentioned have had significantly higher exposure levels in terms of time and flight hours than the Diamond. Also note how the same Cessna model has different stats depending on the length of exposure (how many years they've been in service). So are Cessna 182s and Piper Arrows less safe than Diamond DA40s? Regardless of what those numbers look like, we can't answer the question with the data on hand, i.e., we need more information.
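To make the exposure point concrete, here's a minimal sketch. The accidents-per-airframe ratios are in the same spirit as the figures above, but the fleet-hour totals are entirely hypothetical, invented only to show how a ranking can flip once exposure is factored in.

```python
# Minimal sketch: the same accident counts ranked two ways. The fleet-hour
# totals are HYPOTHETICAL, chosen only to illustrate how a long-serving,
# heavily flown type can look worse per airframe built yet end up with a
# per-flight-hour rate comparable to (or better than) a newer design's.

fleet_stats = {
    # type: (accidents, units_built, hypothetical_fleet_hours)
    "Legacy trainer A": (1000, 4500, 9_000_000),  # decades in service
    "New design B":     (20,   2200,   150_000),  # only a few years flying
}

for name, (accidents, built, hours) in fleet_stats.items():
    per_unit = accidents / built
    per_100k_hours = accidents / hours * 100_000
    print(f"{name}: {per_unit:.3f} accidents per airframe, "
          f"{per_100k_hours:.1f} per 100,000 hours")
```

With these invented exposure numbers, the "safer-looking" new design actually posts the higher per-hour rate, which is exactly why the Nall Report insists on leveling the playing field before comparing types.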
So what does all this mean? Probably the most important point that can be taken away from this discussion is that airplanes are used differently; therefore, comparing one type to another or perhaps to generic GA statistics isn't a fair or useful practice. Certain airplanes give pilots the ability to fly at high altitudes, which increases the risk of encountering unfavorable weather year-round. Considering that most GA pilots have little experience with high-altitude flight and its associated conditions, this type of exposure is of particular concern.
Many of the "traveling" aircraft allow pilots to fly at higher speeds. This requires increased situational awareness and planning, both of which have long been weak spots among pilots in general. Additionally, higher speeds increase the likelihood that the aircraft will actually be used to go places. And when pilots venture outside of the geography with which they're most comfortable, they are exposed to unfamiliar weather systems and conditions. And don't forget about the possibility of flying in and over exotic terrain, again creating a different kind of flight exposure.
And since "traveling" aircraft can actually cruise at decent speeds, they're useful business and recreational travel tools. While this is usually a great thing, business meetings and vacation plans can often amplify "get-there-itis" risks. Conceivably, "traveling" pilots could have a different type of philosophy on flying. Some may equate particular types of avionics or a particular horsepower to a certain level of invincibility. A pilot may be more likely to conduct a flight to a destination with lower minimums if he or she has a glass cockpit and autopilot. Or one may be more easily convinced to take a shortcut over mountainous terrain knowing that the aircraft's service ceiling exceeds the highest peak. Truthfully, a close look at NTSB reports on accidents involving these aircraft types yields an interesting conclusion—pilots are still the primary cause of accidents.
Speaking of pilots, to make a fair statistical analysis, we must look at the types of pilots flying each make and model. According to Columbia Aircraft, at least 80% of its customers fly more than 150 hours a year, compared to the 75% of all pilots who fly less than 150 hours per year. Another interesting stat is that 85% of Columbia owners have an instrument rating, while only 43% of pilots in general can boast an instrument ticket. Do these facts have anything to do with Columbia's excellent record? Probably. It just goes to show that the factors typically ignored in reports are key to making a reasonable determination about the safety of flying in general and of individual aircraft.
Writer Rex Stout once said, "There are two kinds of statistics, the kind you look up and the kind you make up." Hopefully this analysis has opened your eyes so you can more readily spot suspicious data. So the next time someone claims that aircraft A is unsafe or that flying under certain conditions is safe, take a hard look at the data. Was the comparison made between two peers under similar circumstances? Are there factors that weren't taken into consideration? If you take just one thing away from this article, let it be this: don't always believe the hype.