Until the 1980s the U.S. had the lowest automotive death rates of all industrialized countries, but it has since fallen near the bottom of the rankings, as we’ve reported before. That’s not because American roads became deadlier; traffic fatalities declined further in the other countries, making the U.S. an outlier. If we had matched the far-from-perfect safety performance of a number of those other countries, it’s estimated that 10,000 to 20,000 fewer lives would be lost annually on American roads. To make the story even sadder, after years of declining U.S. traffic fatalities, the number ticked up in 2015 and 2016, in part because of an increase in distracted driving.
The key to reducing traffic deaths is to eliminate reckless driving behavior and driver errors. In those matters, many industrialized countries have done a better job than the U.S.—for instance, in cracking down on drunk driving and speeding and in encouraging people to wear seatbelts.
One way to improve driving would be to have it done largely or even totally by computers. Cars that drive themselves some or all of the time—technically known as highly automated or autonomous vehicles—are likely to become increasingly common in the next decade. Not surprisingly, people tend to be skeptical of, or afraid of, putting themselves at the mercy of the computers behind their dashboards, even though they rarely worry about the sizable risk they take joining other all-too-fallible human drivers on the road. This was made clear recently by nationwide headlines about a woman in Arizona who was killed by a self-driving car. That same day, roughly 100 other Americans were killed in crashes involving conventional vehicles, many of them the result of human error—but such crashes rarely make the news.
How perfect do they need to be?
According to a new report from the nonprofit RAND Corporation, the U.S., in particular, has a lot to gain from self-driving cars. It concluded that such vehicles need be only moderately better than human drivers before being widely used—that is, we shouldn’t wait until the technology is perfected. Based on numerous ways of calculating likely outcomes, the researchers found that wide use of autonomous vehicles when they are just, say, 10 percent better than current American drivers would prevent thousands of U.S. fatalities during the first decade—with U.S. road deaths running near 37,000 a year, even a 10 percent improvement amounts to several thousand lives annually—and possibly save half a million lives over 30 years, compared with waiting until they are 75 percent or 90 percent better.
“If we wait until these vehicles are nearly perfect, our research suggests the cost will be many thousands of needless vehicle crash deaths caused by human mistakes,” one of the authors noted. “It’s the very definition of perfect being the enemy of good.”
Developers are testing self-driving cars in various cities, while lawmakers consider new regulations to govern their use. What remains unknown is how good the vehicles will have to be before they are made widely available. Of course, even when self-driving cars are proven to be much safer than average human drivers, they will still get into crashes, and people may be less tolerant of mishaps involving computers than of those caused by humans. “But if we can accept that early self-driving cars will make some mistakes—but fewer than human drivers—developers can use early deployment to more rapidly improve self-driving technology, even as their vehicles save lives,” one author noted.
Also see Reducing Traffic Deaths.