Two Big Questions for Elon Musk

Elon Musk, CEO of Tesla Motors Inc., introduces the Model X car at the company's headquarters in Fremont, Calif. Marcio Jose Sanchez/AP File Photo

The Tesla CEO says it would be “morally reprehensible” for his company to build a fully driverless car before introducing semi-autonomous safety features. But how do we know Autopilot is safer than the alternative?

There are two main approaches to building a self-driving car in 2016: Either you design a vehicle to be fully autonomous from the start, so that no human input is needed, or you design a vehicle that very much requires a human behind the wheel, ready to drive, but that over time, with the piecemeal addition of driver-assistance technologies, becomes a fully autonomous car.

Google’s going with the first approach. Tesla has opted for the second. Elon Musk, Tesla’s CEO, makes a good argument for this incremental approach to driverless cars in his new master plan for the company:

“I should add a note here to explain why Tesla is deploying partial autonomy now, rather than waiting until some point in the future. The most important reason is that, when used correctly, it is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability.”

It seems like a no-brainer: Why would any company that has lifesaving technology today wait to make it available to the public? Apply the same logic to now-standard technologies like antilock brakes, airbags, and power steering, and it’s hard to argue any other way.

But there are two big problems with Musk’s explanation.  

For one, how does Tesla quantify Autopilot’s safety record compared with that of an ordinary car? Musk says that in a car without Autopilot, there is on average one death for every 89 million miles traveled, but that in a car with Autopilot, the first death won’t come until more than twice as many miles have been driven.
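To make the arithmetic behind that claim concrete, here is a rough, illustrative sketch; the 89-million-mile figure and the doubling multiplier are Musk’s claims as reported above, not independently verified statistics:

```python
# Back-of-envelope math only: both inputs are Musk's claims as reported
# in this article, not independently verified statistics.

BASELINE_MILES_PER_DEATH = 89_000_000  # conventional driving, per Musk
AUTOPILOT_MULTIPLIER = 2.0             # "more than twice as many miles"

def deaths_per_billion_miles(miles_per_death: float) -> float:
    """Convert a miles-per-fatality figure into fatalities per billion miles."""
    return 1e9 / miles_per_death

baseline = deaths_per_billion_miles(BASELINE_MILES_PER_DEATH)
claimed = deaths_per_billion_miles(BASELINE_MILES_PER_DEATH * AUTOPILOT_MULTIPLIER)

print(f"Conventional driving: {baseline:.1f} deaths per billion miles")
print(f"Autopilot (claimed):  {claimed:.1f} deaths per billion miles")
```

If the claim holds, that works out to roughly 11.2 deaths per billion miles for conventional driving versus about 5.6 with Autopilot, or half the fatality rate.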

It’s not clear, however, how he reaches this conclusion. Academics like Missy Cummings, a roboticist at Duke, and John Leonard, an engineer at MIT, have long called for more transparency from the manufacturers working in the self-driving car space. (Tesla didn’t immediately respond to a request for clarification Thursday morning.)

The second problem with the way Musk talks about Autopilot is a much broader messaging problem. In its enthusiasm to stake a claim in the self-driving car space, Tesla has arguably given the impression Autopilot is more autonomous than it actually is.

To be fair, the company has been very careful about underscoring the technology’s limitations. Notice that, even in Musk’s bold statement about his company’s moral obligation to the public, he highlights that Autopilot is safer “when used correctly.”

(As a side note, this is why it’s so important for Tesla to be transparent about the data behind its safety claims: Otherwise “when used correctly” becomes an all-purpose get-out-of-jail-free card, because any crash can be dismissed as incorrect usage.)

Tesla was similarly careful in the wording of a statement it released after a driver using Autopilot died in a collision with a tractor-trailer in May.

“It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled,” Tesla wrote at the time. “Additionally, every time that Autopilot is engaged, the car reminds the driver to ‘Always keep your hands on the wheel. Be prepared to take over at any time.’”

It’s still unknown whether the driver who was killed was following those instructions, but it’s obvious many Tesla drivers treat Autopilot as more autonomous than it actually is. There’s proof of this all over YouTube, where several Tesla drivers have uploaded videos of themselves pushing (and in some cases blowing past) the limits of how Tesla says Autopilot should be used.

“They advise you keep your hands on the steering wheel when using the auto-steer, but as we’re testing, you really don’t need to,” says one man in a video that’s had nearly 2 million hits on YouTube. “No hands, no feet, and I’m not nervous at all, really,” he adds, while cruising at 75 miles per hour on a highway.

In Tesla’s new master plan, Musk made it clear the company has no plans to pivot from its incremental approach to autonomy.

“It would no more make sense to disable Tesla’s Autopilot, as some have called for, than it would to disable autopilot in aircraft, after which our system is named,” he wrote.

But by continuing to talk about Autopilot as a facet of eventual driverlessness, Musk may be undermining the very safety he’s promising to the public.