Facebook’s Dystopian Definition of ‘Fake’


For the social-media platform, a doctored video of Nancy Pelosi is content, not a phony.

Every time another “fake video” makes the rounds, its menace gets rehashed before anyone establishes what “fakeness” means in the first place. The latest example came last week: a doctored video of Nancy Pelosi. Unlike so-called deepfakes (machine-learning-made videos in which people appear to say or do things that never actually happened), this video is not technically sophisticated at all. It was altered simply by slowing down the playback and modifying the soundtrack. The result retains the pitch of Pelosi’s voice but makes it sound as if she is slurring her words, incoherent or drunk.

Many news outlets called it a fake; others called it doctored or distorted. Whatever you want to label it, the video was created to spread, and that’s exactly what happened. The Facebook page Politics WatchDog posted a version that has been viewed millions of times, eliciting sneering comments about Pelosi, possibly from viewers who didn’t realize that the video had been manipulated. Others appeared on Facebook, Twitter, YouTube, and elsewhere. President Donald Trump tweeted a reference to the video; his personal attorney Rudy Giuliani shared it, too, although Giuliani later deleted his post. News outlets have chased the story with fervor, even while correctly noting that such pursuit snares the media in the very trap the makers of the video hoped to set.

These sorts of events are insidious because it’s hard to form a response that isn’t a bad one. Talking about the video just gives its concocted message more oxygen. Ignoring it risks surrendering truth to the ignorant whims of tech companies. The problem is, a business like Facebook doesn’t believe in fakes. For it, a video is real so long as it’s content. And everything is content.

The trouble starts when journalists assume that the problem with fakes is obvious. As The Washington Post, CNN, and other outlets covering the video have been careful to note, doctored materials are nothing new, especially online. But those outlets have also gone on to claim that something is novel about videos like this one. At the Post, Drew Harwell wrote that “the outright altering of sound and visuals signals a concerning new step for falsified news,” especially as the 2020 campaigns heat up. At CNN, Donie O’Sullivan also argued that the situation was unique. It is unprecedented, O’Sullivan claimed, that a “fake video” could be quickly viewed by millions of people, and that official political operatives, such as Giuliani, would promote it.

YouTube removed the video, but Twitter and Facebook did not. Facebook did deprioritize the content, making it appear less often. That step also means the site prompts users before they share the clip—although those warnings, which read in part, “Before sharing this content, you might want to know that there is additional reporting on this,” might be incomprehensible to an average person. There’s additional reporting on everything these days.

Normally, tech companies don’t offer much in the way of comment about controversies of truth online. But this time, Facebook has gone on the record, in direct and high-profile ways, to explain its decision to retain the video. The company’s vice president for product policy and counterterrorism, Monika Bickert, spoke with Anderson Cooper the day after the video started spreading to explain why Facebook hadn’t removed the doctored Pelosi clip.

Introducing the television segment, Cooper called the video “fake” and “manipulated,” noting that “Facebook knows it’s fake,” since the company decided to make the material less prominent. How then, Cooper asked Bickert, can Facebook claim that it’s committed to fighting fake news while still hosting and amplifying a doctored video?

Bickert’s response is instructive. She clarified that Facebook doesn’t have a policy against misinformation as such. Outside fact-checkers review controversial material like this, she explained, and then “we dramatically reduce the distribution of that content.”

Cooper asked the obvious question: Why keep it up at all once you know it’s false? Unless content invites immediate harm, Bickert explained, “we think it’s important for people to make their own informed choice about what to believe.”

This line of thinking seemed to perplex Cooper, and rightly so. Why would an immediate impact, such as inciting violence in an acute conflict, be wrong, but a deferred impact, such as harming the reputation of the woman who’s second in line for the presidency, be okay?

Once the content exists, Bickert implied, the company supports it as a tool to engender more content. “The conversation on Facebook, on Twitter, offline as well, is about the video being manipulated,” Bickert responded, “as evidenced by my appearance today. This is the conversation.” The purpose of content is not to be true or false, wrong or right, virtuous or wicked, ugly or beautiful. No, content’s purpose is to exist, and in so doing, to inspire “conversation”—that is, ever more content. This is the truth, and perhaps the only truth, of the internet in general and Facebook in particular.

Some journalists, commentators, and observers seemed to empathize with Bickert’s position. “Think about the implications if they did delete it,” the venture capitalist Kim-Mai Cutler said on Twitter. “Would you want this company being the arbiter of truth of billions of videos a day?” The University of California at Irvine law professor David Kaye noted that it’s not so easy to “draft the rule that prohibits [the] doctored Pelosi video but protects satire, political speech, dissent, humor, etc.” And the New York Times technology journalist Farhad Manjoo invited suggestions for a “specific policy Facebook should adopt to remove this video but not other edited videos,” suggesting that the answer was hardly obvious.

These interventions are telling, because they take for granted that a simple or juridical process is necessary or desirable for Facebook to operate. They seek general rules rather than specific actions. But Facebook is not a court, or a state, or—by its own insistence—even a media company subject to defamation or libel laws. That means that Facebook can do whatever it wants, anytime it wants. It can take down breastfeeding posts if it thinks they contain nudity, which it can decide it doesn’t want on its platform. It can take down pages for alleged copyright infringement, no matter the veracity of those claims, because the Digital Millennium Copyright Act’s safe-harbor provisions shield that kind of corporate overreach. And yes, it can continue to disseminate a video that dangerously misrepresents the speaker of the House just because it feels like it.

If it chose to do so, Facebook could also remove the Pelosi video for no reason whatsoever, or for an official reason that might make as little sense as the rationale for retaining it. Facebook is a private company in the business of capturing and harnessing public attention. Absent a sufficient reason to remove popular content, Facebook would prefer to keep benefiting from the exchange of symbols and ideas around that content. That’s hardly a novel bit of knowledge about how Facebook operates, but Bickert confirmed the matter in an official way during her CNN appearance. That’s the conversation.

A bigger revelation came when Cooper asked Bickert why the company was comfortable removing more than 3 billion fake Facebook accounts from October 2018 to March 2019 but not a “clearly fake video.” Bickert cited Facebook’s long-standing rule that accounts must correspond with a single, real identity, and she noted that fake accounts are also more likely to distribute misinformation. But Cooper still couldn’t see the difference. “You’re in the news business,” he pleaded. “There’s a responsibility that comes with that.”

The sentiment makes sense to a journalist interested in taking responsibility for the information he puts into the heads of the citizenry. But Facebook subscribes to none of that responsibility, so Cooper’s appeal to journalistic integrity falls flat. “We have a site where people can come and share what they think—what’s important to them,” Bickert responded. The News Feed is not a feed of news, in the sense of materials created and disseminated to help citizens make decisions in a democracy. It’s a list of things that people find relevant and engaging enough to post and click on. A remarkable company statement to The Washington Post after the Pelosi video appeared makes that position even clearer: “We don’t have a policy that stipulates that the information you post on Facebook must be true.”

That might be bananas, but it is central to understanding how Facebook works. The Pelosi video exemplifies two different philosophies of fakeness, and that conflict is at the heart of every dispute about veracity on the service.

For Cooper and many others, truth is what’s at stake in defining phoniness. The video is considered “fake” because media’s purpose is to depict the world with veracity, or at least with earnest credibility. For a newsmaker, video is created by pointing a camera at a subject, capturing light and sound through lenses and microphones, and mustering that material as evidence in an argument or notice about what is happening in the world. In that case, editing or manipulating the source material to serve other ends would undermine its appeal to veracity. To a journalist, the Pelosi video is “fake” because it depicts a situation that did not take place, while exhibiting the trappings and function of media meant to do the opposite.

A failure of accuracy is one kind of fakeness, although accuracy has always been a dubious goal—video has never really captured truth. But most media don’t aspire toward truth in the first place. In those cases, calling something a “fake” makes little sense at all. That’s where Facebook seems to stand on the matter.

In a film or a television program or even a YouTube post, video is just a raw material for subsequent composition, through post-processing, editing, collage, or any other set of means. A video created by sampling pictures or sounds of Pelosi for aesthetic instead of political effect would hardly be a “false” one. It’s not even fictional—it’s a video made from Pelosi, not a video of her. But the distinction between fiction and nonfiction matters much less to Facebook than it does to Cooper. The manipulated Pelosi video is still a “real” video, a sequence of moving images and synchronized sounds that plays back on a television, computer, or smartphone screen. This is part of the disagreement that Cooper and Bickert rehearse in the CNN segment. Cooper, the journalist, can’t understand why Bickert won’t see the video as propaganda posing as journalism. Bickert, the social-media executive, can’t grasp why Cooper is able to see the video only as journalism or propaganda, and not just as content, that great gray slurry that subsumes all other meaning.

To fail to distinguish between the different uses is to commit a massive category error, of course. A video created for news is different from one created for propaganda, for art, for expression, for entertainment, or for instruction—to name but a few of the possible uses of the medium. But Facebook can’t be bothered to understand and distinguish between all those uses. The company has claimed that the sheer volume of material people post makes almost any distinction impossible. That argument has some merit, so long as one accepts the necessity of a global social network in the first place—an increasingly dubious idea. But in a case like the Pelosi video, Facebook absolutely could make a decision to eradicate the content if it chose to. It might avoid doing so, perhaps for political purposes. After all, conservatives have been accusing social networks of bias of late.

Facebook is the corporate equivalent of a philistine. It just can’t be bothered to even ponder making distinctions between videos as news, as propaganda, as family albums, or whatever else. Instead, as Bickert explained to Cooper, the facticity of the source of content becomes the most important proxy for value. A “real” person, in Facebook’s eyes, is a legal entity, not just a row in a database done up to look like a person. It’s easy to see why that perspective might be appealing to the company: Despite Mark Zuckerberg’s claims that Facebook is a community, its users are consumers, not citizens. Legal persons have actual lives, meaning that advertising messages might drive them to make actual purchases. Data that Facebook buys or that partners load into the service might improve marketers’ ability to target users as consumers. And actions that those consumers perform on and off the service might increase their incremental value as advertising targets for Facebook.

For that process to work, it doesn’t much matter if these users—citizens of somewhere though they be—create, like, or share “true” videos (photos, posts, links) in the journalistic or documentarian sense, or if they interact with “false” material that also produces a trail of new insights for improving targeting and increasing engagement. Since real people are liking and sharing it, all of that content is real. None of it is fake. Some of it is lies, a lot of it is stupid, and much of it is harmful. But none of it is fake.

Facebook has many failings, but the misconstruing of fakeness on its service must be attributed to us, its users and critics. We’ve fooled ourselves into believing that we have some common ground with Facebook, rooted in a responsibility to something larger than ourselves. Anderson Cooper looks like an earnest, hard-line journalist when he challenges Monika Bickert, who seems to assume the mantle of the sneaky tech executive, on Facebook’s latest failing. But that posture is noble only among a community that agrees that the purpose of content like the Pelosi video is to tell the truth. That is not the case here.

When Facebook says it’s not a news company, it doesn’t just mean that it doesn’t want to fall under the legal and moral responsibilities of a news publisher. It also means that it doesn’t care about journalism in the way that newsmakers (and hopefully citizens) do, and that it doesn’t carry out its activities with the same goals in mind. And yet rather than understanding and responding to those truths, public discourse instead beats its head against a wall trying to persuade Facebook that it “can’t do” what it’s been doing with impunity. It would be much simpler and more productive to take the company at its word, since reforming it into a responsible actor concerned first with its responsibility to truth or citizenship is likely impossible.

It’s like in those Looney Tunes cartoons with Wile E. Coyote and the Road Runner. The coyote paints over the bluff of a cliff to make it look like a tunnel or a roadway, anticipating that the bird will smash into it unaware and meet his end. But every single time, the coyote fails to realize that the Road Runner plays by different rules. The bird is just a representation, after all, a drawing no different from the picture on the side of a cliff. And so it passes through unscathed, every time. Meanwhile, the coyote, who can never shake his own belief in the falseness of the tunnel—a way of thinking the Road Runner never even thinks to think—smashes into the rock every time he gives chase. And even so, he keeps coming back for more.