An alarming share of drivers using partially automated driver assistance features overestimate their vehicles’ self-driving capabilities. Around one in four Tesla Autopilot drivers, and an even higher 53% of Cadillac Super Cruise users, surveyed in a new report said they feel “comfortable” treating their vehicles as fully self-driving, even though those systems fall far short of that characterization.
The survey results, collected by the Insurance Institute for Highway Safety, provide new insight into how drivers using advanced driver assistance (ADA) features actually perceive the technology. Drivers using ADA systems were, in general, more likely than other drivers to perform non-driving activities like eating or texting while on the road. Tesla, Cadillac, Nissan, and other carmakers with popular ADA tools typically deploy sensors on the steering wheel or cameras in the cabin to check whether drivers are paying attention to the road, but these tools have their limits.
Super Cruise and Autopilot, unlike Nissan’s ProPILOT Assist, notably include lockout systems that restrict drivers from accessing ADA features if they repeatedly violate the systems’ fail-safes. The report claims around 40% of Cadillac and Tesla drivers surveyed said they’ve been locked out of those ADA systems at some point while driving. Those high figures, the report argues, suggest drivers regularly failed to respond to warnings and attention reminders.
“None of the current systems is designed to replace a human driver or to make it safe for a driver to perform other activities that take their focus away from the road,” the IIHS wrote. “Track tests and real-world crashes have provided ample evidence that today’s partial automation systems struggle to recognize and react to many common driving situations and road features.”
Super Cruise and Autopilot drivers were more likely than Nissan ProPILOT Assist drivers to engage in activities that took their hands off the steering wheel. Both the Cadillac and Tesla drivers were also more likely than the Nissan drivers to say they feel like they can do non-driving activities better and more often while using the driver assist tools.
The IIHS says that divergence in attitudes between brands could stem from design and marketing strategies. The paper points to Cadillac Super Bowl ads depicting drivers with their hands in their laps, and to Tesla’s commercial-aviation-inspired “Autopilot” branding, as influences potentially capable of shifting driver attitudes and behavior.
“These results from frequent users of three different partial automation systems once again drive home the need for robust, multifaceted safeguards,” IIHS Research Scientist and study lead author Alexandra Mueller said. “Many of these drivers said they had experiences where they had to suddenly take over the driving because the automation did something unexpected, sometimes while they were doing something they were not supposed to.”
In a statement sent to Gizmodo, General Motors, which owns Cadillac, responded to the IIHS report saying it believes proper driver engagement is “critical and required” for any vehicle it sells. GM went on to highlight features of its Driver Attention System, which monitors a driver’s head position and gaze in relation to the road.
“When the system detects the driver isn’t paying attention, a series of escalations will prompt the driver to reengage,” GM said. “When using Super Cruise, the driver is responsible for operating the vehicle in a safe manner and must remain attentive to traffic, surroundings, and road conditions at all times.”
Tesla did not respond to Gizmodo’s request for comment.
For years, critics have worried that the over-ambitious branding and presentation of partially automated vehicles by carmakers, and in one case their eccentric CEO, have contributed to the overconfidence drivers place in these systems.
The obvious elephant in the room here is Tesla CEO Elon Musk. Since at least 2019, Musk has repeatedly tried to convince consumers that Level 5 automation (seen as the gold standard for true self-driving) is “just around the corner.” That, ahem, overly optimistic outlook, critics fear, may have contributed to drivers’ current confidence in supposedly driverless systems. Though Musk has since slightly backtracked on that rhetoric, he and Tesla nonetheless went on to call their newer ADA iteration “Full Self-Driving,” even though it is technically capable of only Level 2 autonomy, which does not meet the standard for a self-driving vehicle.
Numerous videos over the years show Tesla drivers riding in the backseat, sleeping, or even having sex while their vehicles barrel down roadways. In some cases, those ADA-engaged vehicles are getting into wrecks. According to first-of-its-kind data released by the National Highway Traffic Safety Administration earlier this year, the regulator received 392 reports of crashes involving Level 2 advanced driver assistance technology, which resulted in at least six deaths and five serious injuries. A vast majority of those crashes, 273, involved Teslas.
In an open letter last summer, Democratic Senators Richard Blumenthal and Ed Markey urged the Federal Trade Commission to investigate Tesla’s self-driving marketing for signs of “potentially deceptive and unfair practices.” The senators argued Musk and Tesla “repeatedly overstated the capabilities of its vehicles,” which they say could pose threats to drivers and other motorists sharing the roads with them. More recently, the California Department of Motor Vehicles released a pair of filings accusing Tesla specifically of engaging in deceptive advertising in its branding of the Autopilot and Full Self-Driving features. Not long after, the California State Senate passed new legislation its authors believe could prevent Tesla from branding its vehicles as self-driving.
Updated Oct. 11, 2022, 1:04 p.m: Added statement from General Motors.