Tesla will play a vital role in an upcoming trial over a 2019 crash that left two people dead, a manslaughter case that calls into question the potential hazards of Autopilot and other advanced driver-assist technologies.
Authorities say Kevin George Aziz Riad was driving a Tesla Model S when he ran a red light in Gardena, California, and crashed into a Honda Civic. Gilberto Lopez and Maria Guadalupe Nieves-Lopez were in the Civic at the time of the crash, and both were pronounced dead at the scene. Aziz Riad was charged with manslaughter, but his lawyers claim Tesla’s Autopilot feature, which can control the car’s speed, is at fault.
While Tesla has not been charged in Lopez and Nieves-Lopez’s deaths, the trial comes as the company faces public scrutiny over its Autopilot system.
“Who’s at fault, man or machine?” Edward Walters, an adjunct professor at the Georgetown University law school who specializes in the law governing so-called self-driving cars, told The Guardian. “The state will have a hard time proving the guilt of the human driver because some parts of the task are being handled by Tesla.”
Tesla’s website says its driver assistance systems “require active driver supervision and do not make the vehicle autonomous.” However, a study published on ScienceDirect found that drivers are less likely to pay close attention to the road while the Autopilot feature is engaged.
“I can’t say that the driver was not at fault, but the Tesla system, Autopilot, and Tesla spokespeople encourage drivers to be less attentive,” Donald Slavik, an attorney whose firm is representing Lopez’s family, told Reuters. He added that Tesla knew there were risks to the system but didn’t manage them properly. “Tesla knows people are going to use Autopilot and use it in dangerous situations,” Slavik said.
Tesla is being sued by Gilberto Lopez’s family, and a trial is scheduled for July 2023.
The U.S. Justice Department is looking into Tesla’s role in several crashes that occurred while the Autopilot feature was activated. The investigation into Tesla’s Autopilot systems could cause problems for prosecutors in Aziz Riad’s trial, who claim the crash was caused by his failure to control his speed and his failure to brake. “The DOJ probe helps him because his claim is going to be ‘I relied on their advertising. Therefore, I was not aware of the risk there,’” Robert Blecker, a criminal law professor at New York Law School, told Reuters.
Despite Autopilot’s imperfections, Musk said in September that Tesla had a “moral obligation” to roll out the self-driving software and claimed it could save lives. In its owners’ manuals, Tesla advises drivers not to rely solely on the Autopilot feature and notes that outside factors, such as visibility, can impact the system.
“Never depend on these [Autopilot] components to keep you safe,” Tesla says. “It is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times.”
In 2021, Musk claimed Tesla’s Autopilot system was “approaching 10 times lower chance of accident than average vehicle,” despite the feature being under investigation by the National Highway Traffic Safety Administration.
The legal implications of its Autopilot feature may change public perception of Tesla and create difficulties in upcoming lawsuits, University of South Carolina law professor Bryant Walker Smith told Reuters.
“The narrative of Tesla potentially shifts from this innovative tech company doing cool things to this company just mired in legal trouble,” Walker Smith said. “That is the risk, and narrative is very important in civil litigation because both sides tell a jury a story.”
Updated 11/14/2022, 5:40 p.m. ET: This story has been updated with new information regarding a request by prosecutors to delay the trial.