Sunday, July 1, 2007
Deciphering Accident Statistics
Digging beyond the numbers for the complete story
Before you blindly believe the studies and the plethora of numbers associated with them, there are some things you should know and think about. It should be no surprise that aircraft manufacturers want to paint their aircraft in the most favorable light possible; after all, they're trying to sell airplanes. And the media seems to chronically harp on the gloom and doom of flying. The point is, we have to be careful about the statistical games being played by people with different motivations. Statistics can be, and often are, manipulated to make things look better or worse than they really are. For example, one study claimed that kids who weighed more were smarter. Wow. Maybe you should let them eat more fast food, right? Well, no. The truth behind this finding is that kids who are older (and, thus, have more years of education) also weigh more. You've got to get the whole picture; don't necessarily take things at face value. Such nonsense is probably what prompted the line popularized by Mark Twain, who attributed it to British statesman Benjamin Disraeli: "There are three kinds of lies: lies, damned lies, and statistics."
One big problem with comparing aviation accident statistics to driving accident statistics is that you're really comparing apples to oranges. Aviation accident rates are normally expressed per number of flight hours. According to the 2005 Nall Report, there were 7.2 GA accidents and 1.39 fatal GA accidents per 100,000 flight hours. But auto accident rates are based on miles, not hours. The current rate for motor vehicles is 1.3 deaths per 100 million vehicle miles. What if GA accident stats were presented in miles instead? Using an average aircraft speed of 130 mph and the estimated 23.1 million flight hours reported by the FAA in 2005, you'd be left with 16.3 deaths per 100 million miles flown. Not quite as flattering, huh?
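The mile-based figure above can be sanity-checked with a quick back-of-the-envelope calculation. This sketch (Python, with the article's figures hard-coded) works backwards from the 16.3-deaths-per-100-million-miles result to the total GA death count it implies; the roughly 490-death figure it recovers is my own inference from the article's numbers, not a number the article states.

```python
# Unit-conversion check for the hours-to-miles comparison.
# AVG_SPEED_MPH and FLIGHT_HOURS come straight from the article;
# the implied death count is derived, not quoted.

AVG_SPEED_MPH = 130            # assumed average GA cruise speed (per article)
FLIGHT_HOURS = 23.1e6          # FAA's 2005 flight-hour estimate (per article)
DEATHS_PER_100M_MILES = 16.3   # the article's mile-based fatality rate

total_miles = FLIGHT_HOURS * AVG_SPEED_MPH              # ~3.0 billion miles flown
implied_deaths = DEATHS_PER_100M_MILES * total_miles / 100e6

print(f"Total GA miles flown in 2005: {total_miles:.3e}")
print(f"Implied GA deaths in 2005:    {implied_deaths:.0f}")
```

Running the arithmetic forward, about 490 deaths over roughly 3 billion miles yields the article's 16.3 per 100 million miles.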
Am I claiming that flying is dangerous? Of course not. We can look at yet another statistic, the chance of dying in a GA accident in 2005, which was about one in 613,000 while your chance on the road was one in 7,700. Which brings up the next point. If statistics don't always tell the full story, as we've seen so far, how is a particular aircraft deemed "safe" or "unsafe"? Take, for example, the Cirrus. It's gotten a bad rap from various media outlets while the company claims that it's one of the safest airplanes in the sky. I've flown one, and it didn't seem to be ornery at all. In fact, it flew better than many other airplanes I've flown. So what gives?
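For scale, the two per-person odds quoted above differ by roughly a factor of 80, which this one-liner computes. One caveat of my own, not the article's: per-person odds bake in exposure, since nearly everyone rides in cars while comparatively few people fly GA aircraft.

```python
# Ratio of the article's per-person annual odds of death.
# Note: these odds reflect exposure as well as risk -- far more
# people drive than fly GA (my caveat, not the article's claim).

ODDS_GA = 613_000    # 1-in-N chance of dying in a GA accident in 2005 (per article)
ODDS_ROAD = 7_700    # 1-in-N chance of dying on the road (per article)

ratio = ODDS_GA / ODDS_ROAD
print(f"Per person, a road death was ~{ratio:.0f}x more likely than a GA death")
```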
Perhaps the unfair-comparison issue crops up yet again when we analyze the statistics. Airline fatality stats do get compared with GA's, but everyone realizes those are two different ball games. Yet is it fair to compare accident rates of aircraft that are designed to "go places" with those that mostly stick to training, or even those that crop dust? The point is, to truly determine how safe an airplane is, you've got to compare it to aircraft doing the same kind of flying. Most people don't purchase a 200-plus-horsepower, 150-knot aircraft to do pattern work. They'd be more likely to travel, take the family on vacation or go on business trips.