Drivers who want access to Tesla’s Full Self-Driving beta must now consent to letting the company collect images and video of them in the event of a collision or other serious safety event. That’s according to new warning language from the company, first noted by Electrek, which requires FSD beta drivers to agree to provide the data in the event of an FSD-related crash.
Here’s the message some Tesla drivers saw: “By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision.”
Though some Tesla vehicles already use the front-facing camera for driver monitoring, the inclusion of the “VIN-associated image data” language potentially means Tesla can link the footage with an individual driver, Electrek notes. Gizmodo reached out to Tesla for comment regarding the report but hasn’t heard back.
Tesla’s new FSD language comes less than two weeks after what may have been the first major crash involving FSD. In that case, a Model Y operating in FSD was severely damaged after allegedly turning into the wrong lane. Not long before that, Tesla issued a recall affecting 11,704 vehicles over an FSD software glitch that caused vehicles to unexpectedly engage the brakes. Tesla remedied that malfunction with an over-the-air software update.
Tesla’s in-cabin monitoring system speaks to a larger, emerging issue around the tradeoffs between driver safety and personal privacy, especially as the auto industry moves to integrate more autonomous-ish features. The EV maker began rolling out in-cabin driver monitoring in its Model 3 and Model Y vehicles back in March, following an increase in reports claiming the company wasn’t doing enough to ensure Autopilot users were keeping their hands on the steering wheel. Tesla had previously relied on steering wheel inputs to determine whether a driver’s hands were on the wheel, but drivers and safety groups showed that system could be easily spoofed with everyday objects like water bottles and tape.
Though other carmakers, including BMW, Ford, and GM, also use driver monitoring systems, they say theirs are closed-loop systems that use infrared technology to monitor eye position and head movement. Tesla’s decision to instead retain cabin image data and analyze it after a safety event has drawn concerns from some privacy experts, like Electronic Privacy Information Center legal counsel John Davisson, who told Consumer Reports he worries the data could be used by law enforcement or other groups down the line.
“Any time a video is being recorded, it can be accessed later,” Davisson said. “There may be legal protections around who can access it and how, but there’s always the possibility that insurance companies, police, regulators, and other parties in accidents will be able to obtain that data.”
Driver monitoring systems, in some form, are coming and will only become more common. The European New Car Assessment Programme will reportedly require a driver monitoring system in its safety programs as early as 2023. The U.S., meanwhile, may require carmakers to install some form of drunk driving detection technology in new cars as early as 2026. (The EU is also reportedly looking into ways to require carmakers to detect impaired drivers.)
What remains to be seen, though, is whether companies will follow Tesla’s data-collection route, or opt for more limited alternatives. The issue of whether drivers can meaningfully consent to this type of collection could also grow more complicated, especially in the U.S. where states are beginning to adopt their own, sometimes divergent data privacy laws.