Meet the Woman Who Warned About Russian Election Meddling Years Ago—and Got Death Threats

Ann Ravel eventually left the Federal Election Commission because it was so dysfunctional. Rich Pedroncelli/AP

She was worried about the government turning a “blind eye” to the growing force of the internet in politics.

Calls for more transparency and regulation of the content and advertising on Facebook are suddenly coming from both the right and the left in Washington, and they are likely to grow louder as more information emerges about how the company earns nearly all of its roughly $30 billion in annual revenue.

The attention has intensified since Facebook recently admitted that Russian buyers had purchased thousands of ads on its platform about hot-button issues like immigration and gay rights in the run-up to the US election. It has also emerged that its policing of ads was so lax that it was possible to buy ads targeting users interested in topics like “Jew hater” and “how to burn Jews,” ProPublica reported. (Those targeting categories have since been removed.) Facebook isn’t alone: until recently, on Google, it was possible to target messages at people who searched for phrases like “blacks destroy everything,” BuzzFeed reported.

Long before Russia meddled in the 2016 US presidential election, and before lawmakers in Washington, DC began scrambling to figure out how to rein in Facebook, a small federal agency was wrangling with how to regulate the growing power of the internet in political elections.

From the mid-2000s until last year, the Federal Election Commission (FEC) was locked in internal debates about the money being poured into political advertising and campaigning on the web, and about how, or even whether, it should be disclosed. After all, the US had for decades held television broadcasters to strict standards, dictating how the sponsors of political advertisements should be identified, requiring stations to ferret out which candidates third parties were working for, and forcing them to make lists of those backers public. Shouldn’t the internet be held to the same standards?

In October 2014, Ann M. Ravel, then the commission’s vice chair, wrote a statement accusing the FEC of turning a “blind eye” to the growing force of the internet in politics and explaining why she and two of her fellow commissioners, all Democrats, had voted for more disclosure of who funds political material on the web:

Some of my colleagues seem to believe that the same political message that would require disclosure if run on television should be categorically exempt from the same requirements when placed on the internet alone. As a matter of policy, this simply does not make sense. … This effort to protect individual bloggers and online commentators has been stretched to cover slickly produced ads aired solely on the internet but paid for by the same organizations and the same large contributors as the actual ads aired on TV.

The FEC had just held a vote on the topic that ended in a deadlock, with the three Republicans voting against their Democratic colleagues, a common impasse at the increasingly dysfunctional agency tasked with keeping US elections fair and transparent. Nonetheless, Ravel’s statement sparked outcry and anger, especially from conservatives who equated money spent on political advertising on the internet with “free speech,” the same argument that won the landmark 2010 Citizens United Supreme Court case and sent a torrent of cash into political elections.

The end result was that, for years to come, the commission would do nothing to address who was spending money on political advertising on the web, even as Facebook’s audience and influence in the US grew.

Bring on the trolls

A day after Ravel published her statement, fellow commissioner Lee Goodman, a Republican, appeared on Fox & Friends (video) to warn that the three Democrats wanted to censor free speech online and set up a “regulatory regime” that would reach deep into the internet. “Boy, I thought Democrats were for free speech,” remarked the Fox anchor interviewing him, Tucker Carlson. “That was obviously an earlier species.”

Ravel says Goodman’s Fox appearance unleashed a torrent of abuse. The issue was picked up by the Drudge Report, Breitbart, and other right-wing news sites, which singled her out. Responses poured in via Twitter and email, ranging from death threats to misogynistic slurs: everything from “stick it up your c-nt,” she recalled this week, to “You’re the kind of person the Second Amendment was made for.” They also included “Hope you have a heart attack” and “You will more than likely find the ‘Nazi’ scenario showing its ugly head,” the Center for Public Integrity, a nonpartisan investigative news organization, reported (Ravel is Jewish).

Such threats, while appalling, hardly seem novel in America right now. But this was in 2014, eight months before Donald Trump even announced he was running for president, in a campaign that was accompanied by a big spike in anti-Semitic, white-nationalist threats online. Michael Toner, a Republican former FEC commissioner, called the harassment of Ravel “incredibly inflammatory stuff.”

The FEC took the threats so seriously that an event Ravel had planned—to bring together “technologists, social entrepreneurs, policy wonks, politicos, and activists” from across the political spectrum to discuss how political advertising on the internet should be regulated—was cancelled.

The bumper sticker argument

Advertisers are drawn to Facebook and Google because of their massive reach, but also because of their ability to pinpoint the people most likely to buy their products. Both companies rely on algorithms to target ads at certain users, but disclose little about how those algorithms work, so the process is particularly opaque.

In Facebook’s case, the lack of disclosure in the US about who is buying political ads, and which of the company’s 2 billion users see them, has its roots in a creative legal strategy.

In a 2011 letter (pdf, pg. 8) seeking clarification from the FEC, Facebook successfully argued that it should not have to disclose who was behind political advertisements, because the commission did not require political committees to include disclaimers on buttons, bumper stickers, or other small items where a disclaimer “could not easily be printed.” Its advertisements were tiny, Facebook argued, purposely limited to 25 characters in the headline and 135 in the body. “Changing the size or format of these ads would cause a significant disruption to Facebook’s basic advertising,” the company wrote. Facebook also pointed out that the FEC had allowed Google to run search ads from political committees without disclaimers.

Facebook’s revenue from advertising has skyrocketed since then, and the company is engaged in a constant, public battle to keep promoters of everything from pornography to bogus diet pills from using its site to make a profit. While it has been openly policing those things for years, it said little about political advertisements until after an internal investigation in the wake of the 2016 presidential election. “We have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people,” the company said (pdf) this April. It has since announced a series of tweaks to fight misinformation.

The 2011 letter, which asked the FEC for more guidance, resulted in “a lengthy and detailed discussion among FEC commissioners in the hopes of handing down an official opinion,” wrote ClickZ, a trade publication covering the digital marketing industry, after the letter became public. Like so much else at the commission, the discussion “resulted in a non-decision,” ClickZ noted.

Anticipating Putin

Despite the backlash to her 2014 push to get Facebook and other internet companies to be more transparent about where their ad revenue was coming from, Ravel kept pursuing the issue. In 2015, the FEC grappled with how to make sure foreign money wasn’t being used to pay for political advertisements on the internet, a clear violation of federal law.

In doing so, Ravel even anticipated Putin’s influence. “I mean, think of it, do we want Vladimir Putin or drug cartels to be influencing American elections?” she asked at an October 2015 meeting, while pushing for the commission to require state and local campaigns to declare foreign contributions. The commission tried once again to hash out what counted as “local” or “national” given the internet’s global reach.

Once again, though, the FEC settled on doing nothing. In February of this year, Ravel left the commission before her term was up, saying at the time (paywall) that the agency was so dysfunctional that she thought she could do more to improve transparency and fairness in US elections outside the government.

The demonization of Ravel and the FEC’s overall failure to write rules helped create the disturbing situation the United States finds itself in now.

Multiple US government bodies are likely to spend tens of millions in taxpayer dollars investigating how the Kremlin may have influenced the 2016 presidential election, particularly via big social media platforms like Facebook. The Justice Department’s special counsel is currently looking at how Russia may have used Facebook to spread misinformation and propaganda on the web, while Facebook and Twitter executives may be hauled in front of Congress to explain the same.

More broadly, social media is helping to polarize the country—allowing right-wing extremists and hate groups to spread their messages in new ways, and organize anti-immigrant and neo-Nazi demonstrations (sometimes the organizers are even Russian-backed). This is happening while some 45% of Americans use Facebook to get their news.

In a surprise vote that could mark a first step towards fixing the situation, the usually divided FEC voted unanimously on Sept. 14 to reexamine the rules surrounding disclosures on online ads. “For our democracy to work, the American people need to know that the ads they see on their computer screens and in their social media feeds aren’t paid for by Russia or other foreign countries,” commissioner Ellen Weintraub wrote after the vote.