The intersection of US 27A and NE 140th Court in Williston, Florida, where the crash occurred

The investigation into a death that occurred while a Tesla Model S driver was using Autopilot has filled the internet with dystopian-sounding headlines. "Self-driving car driver died for the first time after crash in Florida." "Self-driving Tesla was involved in fatal crash." But this was not a "self-driving car" that killed its "driver." This was a human driving a semi-autonomous car. And this points to why fully autonomous vehicles are the only types of self-driving cars that make sense on our streets. Ever.

Tesla’s Autopilot is a driver-assist setting that’s very similar to what many other cars on the road currently have, allowing vehicles to use sensors and cameras to automatically steer and adjust speeds. What makes Autopilot so game-changing is the software, which learns from data collected while the human is driving. So in certain conditions, like on highways that aren’t in cities, a Tesla Model S can change lanes and even stop to avoid collisions.

But as Tesla has repeatedly stated, its Autopilot feature is still in beta. It does not magically transform the Tesla into a fully autonomous car. The driver still needs to stay in control at all times, including keeping their hands on the wheel. Tesla’s statement on the crash clarifies this once again:

Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

Tesla drivers know this, but still, it's pretty damn cool to sit behind the wheel of a car that feels like it's driving itself. Which is why so many videos exist of people filming their Autopilot experiences in apparent wonderment. Joshua Brown, the driver who was killed in the May 7 crash that precipitated the investigation announced today, made many of these videos himself. Here's one where he documented a close call with a truck merging into his lane. "Tessy," as Brown called his car, swerves to avoid the truck.

Brown’s own video offers what’s perhaps the best illustration of what might have gone wrong—in a Tesla, human driving overrides Autopilot.

According to the Levy Journal police blotter, Brown's Model S traveled beneath an 18-wheeler's trailer that was making a left turn at a highway intersection with no stoplight. We don't yet know what happened in either vehicle during the fatal crash—that's what the National Highway Traffic Safety Administration (NHTSA) investigation wants to find out. (Update: Early reports indicate that the driver may have been watching a DVD.) But you can see why allowing a human to take back control of a car that has already started making a decision to avoid a crash might cause a crash instead. And even if there was some kind of sensor blind spot, or a flaw in the software, or just a flat-out failure of the system, requiring the driver's hands to remain on the steering wheel is problematic. For the system to be foolproof, it should not need—or allow—human intervention as a backup. That's why transit leaders from 46 cities believe fully autonomous cars are the safest solution.

FHP diagram of the crash

Which highlights another important point: We need data to prove this. Companies that are collecting this kind of crash information need to hand it over to the Department of Transportation. Just a few weeks after the crash happened, but before the investigation was made public, Tesla announced that it was doing just that. Whether or not the crash was the impetus for that announcement doesn't matter much now. But this should absolutely be a requirement for everyone from Google to Uber if we want to start saving more lives on our streets.

Last week, a big study explored the ethical dilemmas of how the AI in self-driving cars makes decisions, something that Google has talked about at length. But that study missed an important point: truly autonomous vehicles don't exist in a vacuum. If the tractor-trailer had also been fully autonomous—heck, if both vehicles had simply carried the very basic connected vehicle tech that the NHTSA is making standard on all cars—the truck would have communicated with the Tesla long before any potential crash. This is the best evidence yet that we need to get our hands off the steering wheels as soon as possible—or get rid of the steering wheels completely.