His consistent use of computer programming jargon reveals a lot about his leadership approach.
Mark Zuckerberg’s social awkwardness is readily apparent to anyone who has heard him speak. Yet to his credit, the Facebook founder and CEO has been willing to engage in candid interviews, speaking openly with CNN and Wired in the aftermath of the Cambridge Analytica scandal and, earlier this week, diving headfirst into a round of hard questions from Recode’s Kara Swisher.
These kinds of interviews are a risk for high-profile people like Zuck. Stripped of their well-oiled PR machines, they may provide the world with a dangerous glimpse into the way they actually think. The Swisher interview did expose something important about the strange way Zuckerberg sees the world—not because of what he said, but the way he said it.
Most revealing is his consistent use of computer programming jargon to describe the human suffering that his platform has frequently facilitated. In spite of all the soul-searching Facebook and Zuckerberg have been forced to do in recent years, his instinct is still to see people as if they were data points.
This jargon comes from the computerati, centered around the mecca that is Silicon Valley. They have their own distinctive way of talking: a shared vocabulary that helps them think through and describe technical challenges and possible solutions.
One example is the phrase “use case,” ubiquitous among designers and programmers. A “use case” is a written description of a concrete way that people interact with a given technology. So in the case of Facebook, people using status updates to share movie recommendations would be a use case. So would using updates to rant about politics or post memes.
In the Recode interview, Zuckerberg falls back on the term “use case” to describe people using Facebook Live to stream their own suicides in real time. He repeats this characterization, going on to call suicide-streaming a “use” of Facebook Live: “There were a small number of uses of this, but people were using it to…show themselves self-harm or there were even a few cases of suicide.”
Technically, he is right, of course. A “use case” is defined as having three parts: “actors,” “system,” and “goals.” In this case, all three requirements are satisfied. The actor is a Facebook user. The system is Facebook Live. The goal is public suicide. Ah, yes, a “use case.”
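The three-part anatomy Zuckerberg is leaning on can be made concrete. Here is a minimal, hypothetical sketch of the textbook actor/system/goal schema; the class and field names are illustrative inventions, not anything from Facebook’s actual tooling:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    """The textbook three-part anatomy of a use case."""
    actor: str   # who interacts with the system
    system: str  # the technology being used
    goal: str    # what the actor is trying to accomplish

# A benign use case, the kind designers usually have in mind:
movie_rec = UseCase(actor="Facebook user",
                    system="status update",
                    goal="share a movie recommendation")

print(movie_rec.actor, "->", movie_rec.goal)
```

Suicide-streaming satisfies this schema just as neatly as movie recommendations do, which is precisely the problem: the schema has no field for grief.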
But your first instinct when thinking about someone broadcasting their self-inflicted death to the world—on a platform you created—should not be to consider it in terms of an actor using a system to achieve a goal. It is, quite simply, a “tragedy.” Calling live-streaming suicide a “use case” is a way of talking about a terrible thing without confronting its emotional content. The language makes the act seem equivalent to the many other uses of Facebook Live, like showing people what funny thing your dog is doing.
Then there’s the term “spin up”—another programmer favorite. Nothing satisfies a coder like “spinning” something “up.”
It is a phrase that comes from old hard disk drives, the ones that had to physically accelerate to a certain number of revolutions per minute in order to read and write data effectively. Now, in tech speak, “spin up” has become a generic phrase for “get something going.” Being able to easily spin something up is a great thing. If you’re going to create a new website or set up a database, you want infrastructure already in place that makes it simple to “spin up.”
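To see why engineers prize easy spin-up, consider how little ceremony it takes to spin up a throwaway database when the infrastructure already exists. This is a toy sketch using Python’s built-in SQLite support; real services involve far more provisioning:

```python
import sqlite3

# "Spinning up" a database: with the right infrastructure in place,
# one line gets you a working instance.
conn = sqlite3.connect(":memory:")  # an in-memory, throwaway database
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO posts (body) VALUES (?)", ("hello",))
rows = conn.execute("SELECT body FROM posts").fetchall()
print(rows)  # the database is live and queryable
```

When that infrastructure does not exist, standing something up takes real time, which is the sense in which Zuckerberg used the phrase.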
Swisher asked Zuckerberg what he personally did about family separations at the U.S. border, an issue he deemed “terrible.” He said he didn’t do much because “when a crisis comes up, you can’t just spin this stuff up immediately.” Let’s remember that what is being spun up in this case is not a magnetic disc but a way to spare children from trauma and suffering. Slow spin-up time might be a good reason to avoid creating a new database. It is a bad excuse for not helping lonely kids.
Both of these terms offer Zuckerberg a way to talk about human suffering while removing the humans from the equation, treating them instead as technical abstractions. People are “users,” and their problems are “bugs.” It’s no wonder, then, that Zuckerberg failed to anticipate that people would use Facebook Live to commit suicide, or that he didn’t think it would be a big deal to say Holocaust deniers are just people who “get things wrong.” Zuckerberg can’t predict the ways that his platform will impact people because his way of thinking doesn’t incorporate humans at all.
If Zuck were just an engineer at Facebook, or even CTO, such cold language would be forgivable. But he is the political and moral conscience of an organization that is deeply embedded in the lives of billions. And until he starts talking about people as people—not actors using a system to reach a goal—we should not have confidence that he will do right by us.