The First Cross-Country, Autonomous-Car Trip Is an Ethical Minefield

Gif: Gizmodo

Anthony Levandowski—formerly of Google, where he reportedly contributed to at least one serious crash without alerting the police while overseeing its self-driving car program, and of Uber, where he allegedly imported stolen trade secrets from Google and was fired after the resulting lawsuit—rides again.


Levandowski claims to have taken the first-ever hands-free cross-country highway road trip. A souped-up Prius took him from San Francisco to New York, mostly on Interstate 80, using his newest proprietary self-driving vehicle software, Copilot, the upcoming product from his new company, Pronto. (It will be aimed at the long-haul trucking market.) He claims to have touched the wheel only during planned stops for gas and rest. The story, first reported by the Guardian, is raising all kinds of questions, too few of which are: “why is anyone letting Anthony Levandowski do another autonomous vehicle company?”

It’s a technologically impressive achievement, if it’s true (though a number of experts quoted by the Guardian are very skeptical, which, given Levandowski’s past, should be no surprise). Levandowski reportedly made “multiple attempts,” getting as far as Utah more than once before being forced into a “disengagement”—putting his hands on the wheel to avoid a collision or to change lanes—and starting over. Levandowski was in the driver’s seat the whole time, but said of his apparently successful trip, “If there was nobody in the car, it would have worked.”

It’s also a case of a Silicon Valley figure, whose copious risk-taking is glorified as bold, receiving what seems like an infinite supply of chances, both in the tech press and with industry partners. (The Guardian story is fairly critical, but most of the subsequent coverage has been focused on the technical achievement.)

Remember, Levandowski was instrumental in instilling a culture of recklessness, not just at Google, where he worked—and where safety has since become a more prominent concern—but in the “race to be first” across the nascent autonomous vehicle industry. Levandowski reportedly ignored Google’s own rules about which streets were safe to test self-driving cars on, and caused an accident with another executive in the car. Then he started his own company, Otto, which he stands accused of building with stolen Google trade secrets; Otto was bought up by Uber, which fired him in 2017.

Now, as he says in the blog post announcing his new venture, he’s back. “The ‘why’ is fairly simple,” he writes. “I’m back because it’s my life’s passion to make the life-saving potential of autonomous vehicles a reality.” Given his past record—the man is so passionate about the prospect of saving lives with autonomous vehicles that he has apparently been willing to endanger them in serious car crashes—it is difficult to believe that Levandowski’s mad dash to build self-driving car tech is motivated by deep public health concerns. This—a stunt, impressive or no, with himself in the cockpit—seems like more of a push to prove he’s still a relevant force in the field, and does little to demonstrate he’s learned anything from his much-criticized previous behavior. (His car was pulled over in Utah, but there were no reported accidents.)

For his part, Levandowski writes me that he named the software “Copilot” for a reason, rather than something suggesting a fully autonomous system—raising, perhaps, the question of why he chose to promote it with a purportedly autonomous cross-country trek.


“As mentioned at the end of the video, in our blog post, and our website, our current level of technology is a ‘Level 2’ driver assist system,” he wrote me in an email through a press contact. “It cannot and will not work unless it is only assisting fully alert drivers, as the ‘copilot’ branding is meant to convey (each vehicle driver is still very much the ‘pilot’ of their vehicle). The Copilot product brings to the trucking market the features that have previously been the exclusive domain of luxury passenger vehicles. Over time, our superior software will enable us to begin work on level 4 systems in a much safer and more scaleable way, but we are not there yet today. We believe that clear messaging of what the technology can and cannot do, including by the media, is a critical component of road safety as self driving technology is rolled out.”


However, taking a cross-country stunt test drive in a car piloted by new software may not be the best way to establish that you are a newly chastened and trustworthy entity. Here’s Levandowski, again hurling himself headlong across state lines in a trip that could raise legal questions—self-driving test rules vary state to state, and Levandowski had to pass through a lot of states between California and New York. (Levandowski declined to answer a question about what legal considerations went into his trip.)

Now, after my last story about the unnecessary recklessness of the self-driving car industry, I got some responses along the lines of: well, human-based driving is very dangerous right now, and isn’t this better? And the whole point is: It absolutely can be. Self-driving cars could be so much safer than those operated by dumb, twitchy, tired, occasionally drunk human drivers, so let’s maybe ask from the outset that the people building them not bake recklessness and opacity into their culture and systems. Levandowski has done anything but show he’s up to that task.
