Why the Russia Fake News Scandal Hasn’t Touched Snapchat


Snapchat is deliberately much stricter than other social media companies about which news outlets it partners with.

Facebook’s Instagram has been ruthlessly copying features from Snapchat since the ephemeral messaging app rebuffed Facebook’s $3 billion takeover offer in 2013. But when it comes to what publishers and advertisers can publish on their platforms, Snapchat and Facebook couldn’t be more different.

That helps explain why Facebook is being investigated by Congress for its role in the Russian government’s meddling in the last US election, and faces increasing public scrutiny over the spread of fake news and misleading information on its platform.

Snapchat, on the other hand, has not found any evidence that the Russian propaganda machine tried to buy ads on its app to spread misinformation or sway US voters, company executives say. This sets it apart not only from Facebook, but also from Pinterest, Twitter, Google, and even Pokémon Go. The app has also become one of the most reliable sources of on-the-ground information during a tragedy.

That may be small consolation to investors. Snapchat is on track to lose $1 billion this year as Instagram’s copycatting eats into its user growth, while Facebook made nearly $4 billion in profit in the second quarter of the year, almost all of it from advertising.

The absence of fake news and Russian interference on Snapchat is partly down to how the app operates—its 173 million daily users have an average of about 30 friends who they send snaps to regularly, rather than hundreds as they do on Facebook. You can’t include links, and you can’t look at a private account’s snaps. That makes it less of a connected network and more like a collection of closed groups, meaning it is much harder for anything to go viral.

But Snap is also deliberately much stricter than other social media companies about which news outlets it partners with and promotes, and how it reviews political and advocacy ads that appear there.

Snap vets its “publishing partners,” the media companies that it puts on its news feature called Discover, and so far has only a few dozen. They are mostly mainstream news organizations, from the Economist to BuzzFeed to Vogue.

On Facebook, anyone with an account can start a page and declare themselves a “media/news company.” They can become verified by Facebook if they have a publicly listed phone number, and get a “blue check” of authenticity if the platform verifies that they are in fact the well-known brand they claim to be. That means there are thousands, and maybe tens or hundreds of thousands, of “media companies” on the platform.

Snap is looking for responsible journalism, rather than what attracts the most eyeballs, executives say, and has turned down potential publishing partners who don’t meet its standards. The seriousness of this journalism varies widely—on Friday Discover featured a Wall Street Journal explainer about Trump’s tax-cut plan… and the Daily Mail’s (safe for work) snaps of a Sports Illustrated model losing her bikini top.

Part of the company’s relative caution may reflect problems Snapchat has had in the past. It was hit with a class-action lawsuit last year that accused some of its publishing partners of putting sexually explicit content on the platform, which is used heavily by teens. In January, it introduced new rules that require publishers to flag sexually explicit material, fact-check articles, and not post misleading or deceptive content.

In the wake of the 2016 US election, Facebook set up a fact-checking network to check news stories on the platform that users flagged as fake. But its scope is limited because it doesn’t allow users to flag videos and relies on a tiny group of third-party fact-checkers.

Nor does Facebook ban publishers with a history of spreading conspiracy theories and misleading information. Its chief operating officer, Sheryl Sandberg, summed up the company’s overall stance in an interview this week: “A lot of what we allow on Facebook is people expressing themselves,” she said. “That means you allow other people to say things that you don’t like and go against your core beliefs.”

The difference in philosophy is noticeable in how the two platforms treat advertisers. On Snapchat, any ads deemed political or advocacy ads are reviewed by a human account manager and include information about who paid for them.

Much of Facebook’s advertising process, by contrast, is completely automated, and while it has promised more transparency about who is paying for political advertising, Sandberg made clear in her interview that it would not block political ads even if they include lies that seem designed to provoke.

For example, she said, Facebook would have allowed a controversial ad on Twitter by Senate candidate Marsha Blackburn, which included the lie that Planned Parenthood was selling fetal tissue—a claim refuted by multiple bipartisan investigations. “We allow issue-based ads even when they’re hard,” Sandberg said. A similar advertisement would be unlikely to make it through Snapchat’s review process.

Admittedly, Facebook is acting no differently from US broadcast television and radio stations in years past. Lying is perfectly legal in US political advertising, and the federal government has a long history of doing nothing to stop it. The Federal Trade Commission’s “truth in advertising” rules apply only to commerce—goods or services that are bought and sold—while political ads have been deemed “free speech” in several court decisions over the years.

Still, Snapchat seems to be banking on the hope that its users want material they can trust. The question is whether that will save its business.