It’s yet another attempt by a government to use Facebook to sow discord in the United States.
Facebook announced its latest takedown of what the company calls “coordinated inauthentic activity”—and this time, the propaganda network didn’t originate in Russia à la the 2016 election, but in Iran.
About a million people followed one of 82 suspicious pages, groups, and accounts on Facebook or Instagram. One of the largest, “No racism no war,” had more than 400,000 page likes before Facebook took it down, according to tracking by CrowdTangle, a Facebook-owned social-media analytics service. It grew rapidly this spring and into the summer, adding more than 150,000 likes from May to August.
According to Facebook, “a manual review” of “No racism no war” and the other accounts “linked their activity to Iran,” though, at this time, not to the Iranian government. The various pages generally posted about “politically charged topics such as race relations, opposition to the President, and immigration,” Facebook said.
So why did “No racism no war” become so popular? One reason is Tom Hanks. In July, a doctored image of Hanks was shared 95,000 times, making it the page’s top-performing post that month.
Of course, Tom Hanks himself had nothing to do with it. And how he was used to help promote this page is a fascinating parable about the weirdness of today’s digital environment.
In June of 2015, the U.S. women’s soccer team was playing in a World Cup semifinal against Germany. Hanks tweeted about it, referencing the 1980s movie Bosom Buddies and including a selfie in the team’s jersey.
“Fake turf or not, I'm full Bosom Buds if coach Ellis needs me. No yellow cards! Hanx pic.twitter.com/4eT5cYzmjD” — Tom Hanks (@tomhanks), June 29, 2015
That was the end of Hanks’s involvement. A couple of years later, a variety of people started to doctor the image by simply Photoshopping different designs over the jersey. It became an easily remixable meme template, like the photos of someone holding up a blank white sign.
So, in July of 2017, a Facebook user named Andre Lightner posted a version of the meme in which Hanks’s jersey bore a litany of popular lawn-sign social-justice statements (“Science Is Real,” etc.), according to Snopes.
Somebody in Iran then scooped up that image and held on to it for the next year, before unleashing it in July with no caption or anything else. Just the image. With the audience that “No racism no war” had built with Facebook videos and other memes, it was a hit, and off the image went, scooting around Facebook and helping the page pick up followers. A soccer tweet by Oakland’s most beloved actor had become part of a covert Iranian propaganda campaign.
As with Russia’s efforts in 2016, the easy line, used by politicians of all political persuasions, is that these campaigns are supposed to sow discord by ratcheting up the pressure on existing American disagreements about race and immigration.
But it has never been clear whether these efforts are effective, or even what effectiveness would mean.
The page’s most popular post in March highlighted a police shooting of a black man in his own home in Sacramento. In April, the most popular “No racism no war” post in CrowdTangle’s archive was about a 7-year-old black child who was pulled from a bus. In May, the most popular post was a link to a news story about two black men killed in Oklahoma. In June, the most popular post read: “No one is illegal on stolen land.” And then, July was Hanks.
Iran and Russia’s propagandists are certainly not committed civil-rights activists, and that inauthenticity is Facebook’s official rationale for taking down the pages. The company stressed in a blog post and a call with reporters that it targets the type of activity, not the nature of the content.
What is clear is that these pages reach millions of people. Their content strategies—whether they target the right or the left—mirror those of hundreds of “authentic” Facebook pages.
For an American scrolling down the News Feed, foreign propaganda makes up only a tiny percentage of what appears. What’s strange is how hard it is to differentiate what trolls in St. Petersburg and Tehran produce from the everyday postings of uncles and cousins, small-time media outlets, newfangled activists, and companies selling T-shirts. The pages Facebook took down today had been operating in plain sight during a time when Facebook was on high alert for exactly this kind of operation.
But in the great munging of all content through Facebook, the slurry mostly looks the same, even to Facebook.