Though it seems clear Tesla CEO Elon Musk would rather spend most of his days attempting to weasel his way out of a disastrous Twitter deal, renewed threats of an Autopilot recall from federal regulators may force the billionaire to refocus on his day job.
This week, the National Highway Traffic Safety Administration (a.k.a. Musk’s sworn enemy) announced it’s significantly expanding its investigation into Tesla’s Autopilot driver assistance feature.
NHTSA launched its investigation last year following around a dozen accounts of Autopilot-engaged Teslas reportedly hurling themselves at first responder vehicles. Since then, the agency said it has added six additional crashes to its investigation. The widened probe now covers 830,000 Tesla Model Y, X, S, and 3 vehicles sold between 2014 and 2021. The regulators claim the Teslas being investigated have been involved in 16 crashes that have left 15 people injured and one dead.
In a statement sent to Gizmodo, an NHTSA spokesperson confirmed it was upgrading its investigation from a “preliminary evaluation” of Autopilot to an “engineering analysis.” That new status will let regulators extend crash analyses, perform vehicle evaluations, and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.” The engineering analysis stage, The Washington Post notes, is typically the final stage of an investigation before NHTSA would issue a recall.
The document claims Tesla’s Autopilot feature issued forward collision alerts in a majority of the 16 crashes. Tesla’s automated braking engaged in around half of the cases. On average, the document claims, Autopilot “aborted vehicle control” less than one second prior to impact.
In its statement, the NHTSA spokesperson emphasized that “no commercially available motor vehicles today are capable of driving themselves,” a distinction Musk has previously struggled to make clear.
“Every available vehicle requires the human driver to be in control at all times, and all State laws hold the human driver responsible for operation of their vehicle,” the spokesperson added. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
Gizmodo reached out to Tesla for comment but hasn’t heard back.
Tesla has already faced recalls over other aspects of its self-driving tech. Last November, the company was forced to recall 11,704 vehicles due to a software glitch that could reportedly result in “false forward-collision warnings” or automatic braking. More recently, Tesla had to recall 53,822 vehicles equipped with its Full Self-Driving beta over concerns that its new driver profile settings were encouraging vehicles to engage in illegal rolling stops at intersections.
All of that negative attention appears to have stalled consumer confidence in automated driving systems. A poll conducted earlier this year found a solid majority (63%) of U.S. adults said they would not want to ride in a driverless vehicle if they had the opportunity. Worse still, 44% of adults said they thought the widespread deployment of driverless vehicles would be a net negative for society, compared to 26% who thought it would be a net positive. Safety remains a paramount issue. According to a separate April survey conducted by Morning Consult, just 19% of U.S. adults said they believed driverless vehicles were safer than traditional cars. That’s an eight percentage point dip from the 27% who said they thought driverless technology was safer in 2018.