Self-Driving Cars and the Looming Privacy Apocalypse

Google's self-driving Lexus car drives along street during a demonstration at Google campus in Mountain View, Calif. Tony Avelar/AP File Photo

Driverless vehicles will learn everything about you, and influence your behavior in ways you won’t even realize.

Allow me to join you, if I may, on your morning commute sometime in the indeterminate future.

Here we are, stepping off the curb and into the backseat of a vehicle. As you close the car door behind you, the address of your office—our destination—automatically appears on a screen embedded in the back of a leather panel in front of you. “Good morning,” says the car’s humanoid voice, greeting you by name before turning on NPR for you like it does each day.

You decide you’d like a cup of coffee, and you tell the vehicle so. “Peet’s coffee, half-a-mile away,” it confirms. Peet’s, as it turns out, is a few doors down from Suds Cleaners. The car suggests you pick up your dry cleaning while you’re in the neighborhood. “After work instead,” you say. The car tweaks your evening travel itinerary accordingly.

As we run into Peet’s to grab coffee, the car circles the block. Then, we’re back in the vehicle, en route to your office once again. There’s a lunch special coming up at the vegetarian place you like, the car tells you as we pass the restaurant. With your approval, it makes a reservation for Friday. We ride by a grocery store and a list of sale items appears on the screen. With a few taps, you’ve added them to your existing grocery list. The car is scheduled to pick up and deliver your order this evening.

We’re less than a mile from your office now. Just like every morning, your schedule for the morning—a conference call at 10 a.m., a meeting at 11 a.m.—appears on the screen, along with a reminder that today is a colleague’s birthday.

This is the age of self-driving cars, an era when much of the minutiae of daily life is relegated to a machine. Your commute was pleasant, relaxing, and efficient. Along with promising unprecedented safety on public roadways, driverless cars could make our lives a lot easier—freeing up people’s time and attention to focus on other matters while they’re moving from one place to the next.

But there’s a darker side to all this, too. Let’s rewind and take a closer look at your commute for a minute.

There we were. The car picked us up. We wanted coffee. It suggested Peet’s. But if we’d stopped to look at the map on the screen when this happened, we might have noticed that Peet’s wasn’t actually the most efficient place to stop, nor was it on your list of preferred coffee shops, which the car’s machine-learning algorithm developed over time. Peet’s was, instead, a sponsored destination—not unlike a sponsored search result on Google. The car went ever-so-slightly out of the way to take you there.

Same goes for your dry cleaner’s. The only reason you dropped off your clothes there in the first place was that the car suggested it. And the car suggested it because Suds paid Google, the maker of the self-driving car, to be a featured dry-cleaning destination in your area.

As for the lunch special, that really is a favorite restaurant of yours—but the car has never driven you there before. It knows your preferences because the vehicle has combed through your emails, identified key words, and assessed related messages for emotional tone. Similarly, the car knew which sale items to show you from the grocery store because it reviewed your past shopping activity. Plus, there was that one time you told a friend who was sitting in the car with you how much you liked a particular beer you’d tried the night before. The car heard your conversation, picked up on brand keywords, and knew to suggest the same beer for your shopping list when it went on sale.

In this near-future filled with self-driving cars, the price of convenience is surveillance.

This level of data collection is a natural extension of a driverless car’s functionality. For self-driving cars to work, technologically speaking, an ocean of data has to flow into a lattice of sophisticated sensors. The car has to know where it is, where it’s going, and be able to keep track of every other thing and creature on the road. Self-driving cars will rely on high-tech cameras and ultra-precise GPS data. Which means cars will collect reams of information about the people they drive around—like the data Uber has amassed about its customers’ transportation habits, but down to an astonishing level of detail. The more personalized these vehicles get—or, the more conveniences they offer—the more individual data they’ll incorporate into their services. The future I described might still be a ways off, but there’s no reason to believe it’s especially far-fetched.

The companies building self-driving vehicles have been cagey, so far, about how they’re thinking about using individual data. At a congressional hearing about driverless cars last week, Senator Ed Markey, a Democrat from Massachusetts, asked repeatedly whether driverless car manufacturers would commit to a minimum standard for consumer privacy protection. No one who was there to testify—including representatives from Google, GM, and the ride-sharing service Lyft—had a clear answer. “You need a minimal standard,” Markey said at one point. “I’m not in a position to comment on that for Google,” said Chris Urmson, the head of Google’s self-driving car project.

Google has avoided this question before, too.

Last June, John M. Simpson, the director of the Privacy Project for the nonprofit advocacy group Consumer Watchdog, attended Google’s annual shareholder meeting. (Simpson bought two shares of Google stock, he told me, just so he could have the opportunity to question the company’s executives.)

Simpson asked: “Would you be willing to protect driverless car users’ privacy in the future, and commit today to using the information gathered by driverless cars only for operating the vehicle—and not for other purposes such as marketing?”

The executives on the stage glanced at each other for a moment, before David Drummond, a senior vice president and Google’s chief legal counsel, spoke.

“I think it’s pretty early in the game with driverless cars... to have a lot of rules saying, ‘thou shalt not do X, Y, and Z, with the data,’” Drummond said. “I think once we get these operational, the value could be significant... it’s a little early to be drawing conclusions which would, in a lot of ways, reduce innovation and our ability to deliver a great consumer product.”

One approach to protecting privacy could be to anonymize all of the data that self-driving cars collect—making sure specific travel itineraries or details from a given trip aren’t tied to an individual, for example.

But there’s huge potential value to companies that mine individual data and use it for marketing and other services. Self-driving car makers could require an opt-in from consumers before collecting their data—but even that approach is often imperfect. For one thing, self-driving car manufacturers could choose to make opting in a requirement for using the technology at all. And even if individuals are given the choice to opt out of sharing their data—as anyone who has signed a tech platform’s terms of service without reading it knows—such agreements are often lengthy, full of legal jargon, and difficult to parse. Some in the industry, though, are convinced that Google and its peers have enough incentive to be transparent about how they intend to use passenger data.

“For companies like Google and Uber, privacy issues are very important,” said Amnon Shashua, a co-founder of Mobileye, which makes machine-vision technology for self-driving cars. “That could kill a business, if you don’t handle privacy properly.”

Simpson, from Consumer Watchdog, doesn’t believe that the importance of privacy means tech giants will do the right thing. “Sometimes it’s just that the people who are designing the gizmo don’t even think in terms of privacy,” he told me. “They just think: more data is always better. In their minds, it’s just, ‘We may not know what we’re going to do with that data.’”

But that’s not good enough, Simpson says. “It’s inappropriate.”