
There Is Absolutely No Reason to Trust the Safety Record of Tesla’s Autopilot System

A Tesla Model S that crashed in South Jordan, Utah, while in Autopilot mode accelerated in the seconds before it smashed into the stopped firetruck, according to a police report obtained by The Associated Press. Two people were injured.
Photo: South Jordan Police Department (AP)

Tesla has long lurked in a category of its own in the self-driving car race; where Uber and Google’s Waymo are building fully autonomous vehicles essentially from the ground up, Elon Musk’s electric car company is slouching towards autonomy through a series of increasingly sophisticated updates to its semi-autonomous Autopilot system. Because Teslas are not totally self-driving, and because they are already on the roads, this puts the company in a sort of grey area—even greyer than the already grey area where standard autonomous vehicles dwell—when it comes to regulation and oversight.

This is a problem. Musk is a nonstop booster and font of optimism for Autopilot’s self-driving capabilities; he has millions of diehard devotees and customers; and Autopilot has so far been enabled during at least four fatal Tesla crashes. It’s a volatile and increasingly dangerous situation, especially as Musk continues to vouch for the system’s safety and to make claims like the one that there will be a million autonomous Teslas on the road *next year*.


Meanwhile, no one really has any good data about how and in what circumstances crashes that involve Autopilot happen; states don’t require that kind of data be collected because Teslas aren’t technically autonomous cars. So investigators have access to only small slivers of said data, and Tesla refuses to share any of its own trove.

All of which is why Matt Drange’s epic investigation into Autopilot’s safety record for The Information should absolutely be making a bigger splash than I’ve personally seen it making. Perhaps it’s because it’s hard to get anything to stand out these days that does not rise to the level of intrigue of our increasingly Caligula-esque president being deceived into believing he was not walking past a Navy destroyer with his late political rival’s name on it. Perhaps it’s because it literally costs hundreds of dollars to subscribe to The Information and to read stories that serve the public interest like this one. Who knows.


But while the entire piece is full of good reporting and interesting insights about the many, many challenges regulators and safety officials face in coming to grips with the Autopilot situation, one thing stuck out: Tesla’s refusal to even comment on the record in any official capacity about Autopilot’s safety record. Not only does Tesla apparently refuse to make the Autopilot data public or share it with regulators; it wouldn’t even discuss the numbers with Drange.

Musk has in the past been lauded for his transparency—see: his detailing very specific, elaborate plans to bring Tesla to the mainstream, or open-sourcing tossed-off Hyperloop specs—and has also been chastised for being too transparent. See: his Twitter feed, which is perpetually on the brink of a very public and very expensive train wreck.

So the fact that he will not cough up any of the data about Autopilot feels pretty telling. If it put Tesla in a positive light, there seems to be little question Musk would loose it upon the world. That’s what he does.

Instead, Tesla publishes its own quarterly vehicle safety reports that purport to demonstrate how driving with Autopilot is much safer than driving without it. As Drange notes, “the reports only show a rate of collisions on a per-miles-driven basis and don’t disclose what caused the crash and whether the Tesla driver was at fault.” The experts he cites aren’t buying it either. “It’s obviously a misrepresentation,” Hemant Bhargava, a UC Davis professor of technology management, told Drange. “You’re only in autonomous mode in the best scenarios, so the number of crashes will be lower.”


(Tesla wouldn’t share any data with me, either—a spokesperson referred me to the same Vehicle Safety Report.)

As such, there is absolutely no conceivable reason to trust the safety record of Tesla’s Autopilot system—and the stakes are only getting higher. In his public appearances, Musk continues to all but encourage users to switch on Autopilot and let the software take over. He remains so full-bore bullish on Autopilot, so deeply convinced of its safety, that it can seem at times as if his own staff pulled a White House-staff-in-Japan and somehow hid from his feeds the news that four people have died while driving with the software enabled. Because at this point, it verges on delusion.


Until recently, Tesla even sold its cars with the promise that they all came equipped with “full self-driving hardware”—a phrase that was plastered on its website when I wrote about the nascent industry’s recklessness a few months ago. After I argued that promoting the feature might be creating a culture of belief in the system, Tesla’s PR team angrily contested the accusation at length; it was probably one of the most contentious conversations I’ve fielded in my entire career. (Now, it looks like they’ve at least adjusted the language on the website.)

But drivers have already absorbed Tesla and Musk’s techno-optimism in these systems, which, again, we have little reason to trust. Before we can, Tesla has to get real about Autopilot—share its collision data with safety investigators, or better yet make it public; tone down the aggressive and unrealistic autonomy rhetoric; be honest about the state of its self-driving ambitions—or a fifth Autopilot fatality is all but inevitable.