Can Facebook's 'I Voted' Button Really Skew an Election?


Adams County, Colorado, voters cast their votes at a polling place in the Thornton Recreation Center, Tuesday, Nov. 4, 2014. Jack Dempsey/AP

Facebook believes that in 2010, its election-day module was responsible for more than 600,000 additional votes.

Open Facebook today and you’ll see a public service announcement of sorts.

“It’s Election Day,” proclaims the text. “Share that you’re voting in the U.S. Election and find out where to vote.”

Then Facebook offers you a button to do that sharing: “I’m a Voter.”

To entice you to vote (or, at least, click that button), Facebook lists a couple of friends’ names and profile pictures, and notes that 1.8 million other people have already done the same. (Which is a little staggering, since polls haven’t even opened on the West Coast yet.)

This civic-minded setup has become an election-day tradition on the website. Some form of the “I Voted!” button has appeared on the page for every major U.S. election since 2008. You vote, then you tell Facebook about it, exhorting your friends to engage in their civic duty.

These buttons, though, have also always been part of experiments. The voting button in 2010, for instance, was part of a study later published as “a 61-million-person experiment in social influence and political mobilization.” That study found that the voting button could measurably shape who actually voted: if you’re told that your friends have voted, you’re 0.39 percent more likely to vote than someone who wasn’t told. Facebook believes that in 2010, its election-day module was responsible for more than 600,000 additional votes.
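To see how a fraction of a percent scales into hundreds of thousands of ballots, here is a rough back-of-envelope sketch. The inputs are round numbers assumed for illustration, not Facebook’s internal figures, and the 600,000-plus estimate also counts indirect friend-of-friend effects that the direct lift alone doesn’t capture.

```python
# Back-of-envelope: how a tiny per-user lift scales across a huge user base.
# All inputs are illustrative assumptions, not figures from Facebook's study.
users_shown_message = 61_000_000   # roughly the size of the 2010 experiment
direct_lift = 0.0039               # 0.39 percent higher chance of voting

extra_direct_votes = users_shown_message * direct_lift
print(f"Extra votes from the direct effect alone: {extra_direct_votes:,.0f}")
# -> about 238,000; Facebook's 600,000+ figure also attributes votes to
#    friends influencing friends, beyond the direct nudge counted here.
```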

In other words, to paraphrase Harvard professor Jonathan Zittrain, the 2000 presidential election—where George W. Bush won Florida by 537 votes—could have been altered by a Facebook election button.

But in order to do these kinds of experiments, Facebook has to create a control group—which means only showing the button to some users, and not to others. In 2012, Facebook said that it would make the button available to every user, as a kind-hearted effort to increase voting all around. Except that it never actually did, and instead continued the testing. As revealed by Micah Sifry in a feature at Mother Jones last week, the 2012 election button played host to many experiments:

Most but not all adult Facebook users in the United States had some version of the voter megaphone placed on their pages. But for some, this button appeared only late in the afternoon. Some users reported clicking on the button but never saw anything about their friends voting in their own feed. Facebook says more than 9 million people clicked on the button on Election Day 2012. But until now, the company had not disclosed what experiments it was conducting that day.
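The discrepancies Sifry describes are what ordinary A/B testing looks like from the user’s side. As a minimal sketch, with invented variant names and weights that do not reflect Facebook’s actual setup, random assignment to experiment arms might look like this:

```python
import random

# Minimal A/B-test sketch: each user is randomly assigned to one message
# variant or to a holdout (control) group that sees nothing.
# Variant names and weights are invented for illustration.
VARIANTS = ["control (no button)", "I Voted", "I'm a Voter"]
WEIGHTS = [0.02, 0.49, 0.49]

def assign_variant(user_id: int, seed: int = 2012) -> str:
    """Deterministically assign a user to an experiment arm."""
    rng = random.Random(hash((seed, user_id)))
    return rng.choices(VARIANTS, weights=WEIGHTS, k=1)[0]

print(assign_variant(12345))
# Comparing turnout between arms (matched against public voting records,
# as the 2010 study did) shows which message moved more people to vote.
```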

A Facebook vice president told Sifry that the experiments were conducted primarily to find out whether changing the text of the share button—from “I’m a Voter” to “I Voted” to something else—affected how many people clicked it. We won’t be able to confirm anything about those experiments, though, until next year, when the academic results come out. And even then, it’s likely that Facebook conducted other user tests that it will never publish in an academic journal.

Why does this matter? Facebook can already figure out so much about us: our politics, our income, our sexual orientation—even when we’re about to fall in love. As Zittrain wrote earlier this year, the company could easily combine that tranche of data with selective deployment of its “I Voted” button and tilt an election. Just make certain populations more likely to see the button, and, ta-da: turnout shifts your way.
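A crude way to see why selective deployment matters: if the nudge is shown only to users a model predicts will favor one candidate, even a 0.39 percent lift produces a lopsided pile of extra votes. The audience size, lean split, and lift below are assumptions for illustration, not anything Facebook has disclosed.

```python
# Illustration only: how selectively showing a turnout nudge tilts the net vote.
# Audience size, predicted lean, and lift are assumptions, not real data.
audience = 10_000_000            # users a hypothetical targeter chooses to nudge
lean_toward_candidate_a = 0.90   # targeter mostly picks users predicted to favor A
lift = 0.0039                    # extra probability of voting after seeing the button

extra_votes_a = audience * lean_toward_candidate_a * lift
extra_votes_b = audience * (1 - lean_toward_candidate_a) * lift
print(f"Net swing toward A: {extra_votes_a - extra_votes_b:,.0f} votes")
# -> roughly 31,200 votes, vastly more than the 537-vote Florida margin in 2000.
```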

And Facebook, frankly, may be altering elections already. Social networks skew young and female: two reliably progressive-leaning demographics. Even if Facebook distributed the button equally to its users, it might still bring more liberal users to the polls than conservative ones.
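The arithmetic behind that claim is short. Assuming, purely for illustration, 150 million U.S. users split 55/45 between liberal- and conservative-leaning people, a uniformly deployed nudge still nets out in one direction:

```python
# Uniform deployment on a skewed audience: the split and user count are assumed.
us_users, lift = 150_000_000, 0.0039
print(f"Net liberal advantage: {us_users * (0.55 - 0.45) * lift:,.0f} votes")
# -> about 58,500 extra votes, with no targeting at all.
```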

At the end of his piece, Zittrain proposes that the FCC regulate the company’s “I Voted!” button, as it did with subliminal messaging in the early 1970s. Some conservatives might balk at such a suggestion. Maybe they’re right. But until American society better understands Facebook’s ability to skew elections (and everything else), those same conservatives might be the ones who suffer the most tangible losses.