How the Media Screwed Up the Fatal Tesla Accident

Joshua Brown died tragically while his Tesla was on Autopilot. But the media’s response shows how little it understands about technology.
A Tesla Model S in Autopilot mode. From Bloomberg/Getty Images.

Saturday, May 7, was yet another tragically fatal day on American roadways. A woman in Chicago ran a red light, killing one person and sending six others to the emergency room. On Florida’s east coast, not far from Jacksonville, a car flipped on I-95 and killed four more people. And then there was another fatal crash in northeastern Pennsylvania, near Scranton, in which three people died. These were just among the most notable accidents. Throughout that day, hundreds of other people were killed or seriously injured in their cars across the country. Car crashes, which kill nearly 33,000 people a year in America alone, have become a grim statistical norm. Perhaps the only accident you will recall from that Saturday was the death of Joshua Brown, who was killed when a tractor-trailer turned in front of his Tesla Model S while it was driving under the semi-autonomous cruise-control system called Autopilot.

Since the news broke about this accident, the story has been picked up in thousands of outlets—including this one—around the world, many of which raised significant concerns about Tesla’s Autopilot feature and the future of driverless cars. The New York Times ran a front-page story on Brown, quoting experts who said that the accident was “a wake-up call” for the rapidly burgeoning self-driving industry—an incident, in fact, that should force us to “reassess” driverless cars. Fortune excoriated Elon Musk and Tesla for misleading shareholders and not disclosing the crash sooner. And a local Florida ABC News affiliate said the crash was “raising safety concerns for everyone in Florida.” (Yes, the newscaster literally said “everyone.”)

Sure, these are all fair points, and Brown’s death is clearly a tragedy, but perhaps the media is looking at this all wrong. According to the National Highway Traffic Safety Administration, in 2014, the last year for which these statistics were recorded, one person died for every 100 million miles driven in the United States. According to Tesla, however, the accident on May 7 was the first fatality in 130 million miles of driving on Autopilot. By those raw numbers, your odds of remaining safe on the road are actually better in an autonomous vehicle than with a human behind the wheel.

What about globally? Well, those numbers are even more frightening. The Association for Safe International Road Travel notes that nearly 1.3 million people die a year in car crashes around the globe—on average, roughly 3,300 deaths a day. That works out to approximately one fatality every 60 million miles driven. (Presumably this has something to do with the millions of people in China hitting the road for the first time.) On top of that, an additional 20 million to 50 million people are injured or disabled as a result of auto accidents annually. Yet after a single accident on Autopilot, people are already talking about banning driverless cars.
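The comparison being drawn here can be sanity-checked with a few lines of arithmetic. The inputs below are the figures cited above (NHTSA, Tesla, ASIRT); everything derived from them is a rough back-of-the-envelope estimate, not an official statistic.

```python
# Sanity-checking the fatality rates cited in the article.
# Inputs are the article's own figures; derived values are rough estimates.

US_MILES_PER_FATALITY = 100e6         # NHTSA, 2014: one death per 100M miles
AUTOPILOT_MILES_PER_FATALITY = 130e6  # Tesla: first fatality in 130M Autopilot miles
GLOBAL_MILES_PER_FATALITY = 60e6      # the article's global estimate

GLOBAL_DEATHS_PER_YEAR = 1.3e6        # ASIRT: nearly 1.3 million a year

# A death every 60 million miles implies this much driving worldwide per year:
implied_global_miles_per_year = GLOBAL_DEATHS_PER_YEAR * GLOBAL_MILES_PER_FATALITY
print(f"Implied global vehicle-miles per year: {implied_global_miles_per_year:.1e}")

# Express each figure in the unit NHTSA reports: deaths per 100 million miles.
for label, miles in [("Global average", GLOBAL_MILES_PER_FATALITY),
                     ("U.S. average", US_MILES_PER_FATALITY),
                     ("Tesla Autopilot", AUTOPILOT_MILES_PER_FATALITY)]:
    rate = 100e6 / miles
    print(f"{label}: {rate:.2f} deaths per 100M miles")
```

On these numbers alone, Autopilot's rate (about 0.77 deaths per 100 million miles) comes in below both the U.S. average (1.0) and the global average (about 1.67), though a single data point is a thin basis for the comparison.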

It doesn’t matter that the technology in question was in beta, and that it warned the driver, repeatedly, to keep his hands on the wheel. “Many unforeseen circumstances can impair the operation of Traffic-Aware Cruise Control,” the car manual cautions drivers. “Always drive attentively and be prepared to take immediate action.” And let’s not forget that Brown was allegedly watching a movie while he was in the car. Yet the immediate response has been to lambast the technology, as if humans were just so much better at operating heavy machinery.

I’m not usually one to apologize for Silicon Valley—an industry that has helped give rise to Donald Trump and ISIS and is changing the world, largely, through the world’s most prodigious portfolio of go-fetch companies. But the Valley has it right on autonomous vehicles. Technology, after all, compresses the learning curve. All of the Teslas out there, driving around on Autopilot, collectively learn from their mistakes; the technology keeps getting better as the software that runs one Tesla shares what it has learned with the rest of the fleet. Humans, by contrast, each have to learn from our mistakes individually. In the meantime, maybe Tesla should give Autopilot a less alarming name than its current “Look! No hands!” moniker. Cruise control, after all, went through several naming iterations, including “Controlomatic” and “Speedostat,” when it was introduced in the 1950s.

Brown’s death is undoubtedly a tragedy. But it will just as surely make our roads far safer, given what the industry will learn from the accident. Over the past few years, I’ve interviewed dozens of people about autonomous vehicles, and the indisputable consensus is that the goal of these vehicles is to save lives. The Department of Transportation notes that 94 percent of car accidents are caused by human error. In theory, driverless cars could save more than 30,000 lives annually in America alone. Globally, that number is over a million. Perhaps the current news cycle can be summed up by a Car Throttle review of Tesla’s Autopilot last year, in which a passenger speeding down the freeway said, “I actually think there are a lot of crap drivers out there, and I think this is probably better than a lot of people.”

Statistically speaking, it seems the man was right.