Facebook Has All the Power


Media critic Jay Rosen says a controversial mood study shows how few rights and how little control Facebook users actually have.

Amid growing calls for formal investigations into Facebook's disturbing mood manipulation research, media scholar Jay Rosen has a reminder for journalists, editors, and everyday social media users alike: "Facebook has all the power. You have almost none."

The experiment, conducted without users' knowledge or consent, manipulated the News Feeds of nearly 700,000 Facebook users to test how altering the content they saw affected their moods.

The Federal Trade Commission is considering two formal complaints about the 2012 Facebook research, contentiously published in the journal Proceedings of the National Academy of Sciences: one from US Senator Mark Warner and another from the Electronic Privacy Information Center. Meanwhile, in the UK, the Information Commissioner's Office is investigating.

I caught up with Rosen, who teaches journalism at NYU and is also an advisor to First Look Media.

* * * 

As a journalism and media studies professor, what do you think universities need to do to ensure that higher ethical standards are applied to human-subjects research conducted on social networks?

The problem is that scholars covet thy neighbor's data. They're attracted to the very large and often fascinating data sets that private companies have developed. And for good reason. There could be a lot of discoveries hidden in there! It's the companies that own and manage this data. The only standards we know they have to follow are in the terms-of-service that users accept to create an account, and the law as it stands in different countries.

I believe it was the "sexiness" of the Facebook data that led Cornell University and the Proceedings of the National Academy of Sciences (PNAS) into an ethically dubious arrangement, where, for example, Facebook's unreadable 9,000-word terms-of-service are said to be good enough to meet the standard for "informed consent."

When the study drew attention and controversy, there was a moment when they both could have said: "We didn't look carefully enough at this the first time. Now we can see that it doesn't meet our standards." Instead, they allowed Facebook and the PR people to take the lead in responding to the controversy. The moment for reflection was lost. Cornell (home to the two scholars who collaborated with Facebook) is saying it did everything right. PNAS is saying it has "concerns," but it too did everything right.

While we know that Facebook has access to our content, and that our rights as users are increasingly diminished, there is something particularly creepy about the knowledge that, without our consent and in an Orwellian fashion, we can be unwitting participants in psychological experimentation. What should this reality signal to Facebook users? Is it time to pull back?

You have (almost) no rights. You have (almost) no control. You have no idea what they're doing to you or with you. You don't even know who's getting the stuff you are posting, and you're not allowed to know. Trade secret! As the saying goes: "If you're not paying for the product, you are the product." As long as you understand and accept all that, then proceed. With caution.

Are there any particular warnings here for journalists and editors in terms of their exposure on Facebook?

Yeah. Facebook has all the power. You have almost none. Just keep that in mind in all your dealings with it, as an individual with family and friends, as a journalist with a story to file, and as a news organization that is "on" Facebook.

Did you ever hear that line, "don't pick fights with people who buy their ink by the barrel?" That was a statement about power in a previous media age. Now Facebook is in that position. They buy their ink by the barrel. So don't think this is anything but a relationship among unequals.

Some journalists and editors have told me that they're thinking of closing their Facebook accounts in the wake of this scandal. What's your response to that reaction? Would you consider that course of action now?

Yes, I have considered it. And I may do that one day. I have 180,000+ subscribers on Facebook but I barely use it. I can go for a week or two without logging in. I post photos I am proud of occasionally, and sometimes links to my own work. Last week I posted a lot on Facebook about the issues we are discussing now, using the platform to air criticism of it.

But what I do every day on Twitter—curate links and comment in the area of my expertise, adding value to the system for free because I get something back—I will not do on Facebook because of the opacity of its algorithm. Facebook thinks it knows better than I do what those 180,000 subscribers should receive from me. I find that unacceptable, though I understand why they do it. I am not in a commercial situation where I have to maximize my traffic, so I can opt out. Right now my choice is to keep my account, but use it cynically. 

What are the implications of these revelations for content producers? We know Facebook manipulates the presentation of content in news feeds, and that influential positions in news feeds can be bought by 'boosting' content. But does this level of experimentation point to a further undermining of audience-driven news priorities and traditional news values?

The right way to think about it is a loss of power—for news producers and their priorities. As I said, Facebook thinks it knows better than I do what "my" 180,000 subscribers should get from me. Of course it was their algorithm and their choices that helped me get those 180,000 subscribers to my public account. But this just underlines the point: they have all the power. If I want some of it back I have to pay money for "reach." That's cynical, on their part, which is why I use the platform cynically.

What are the long-term implications of these statements of power? And how should Facebook change in response?

Forgive a slight tangent. In the 1960s in the U.S. it was not unusual for metro newspapers to have 80 percent market share or more. By the 1990s it was under 50 percent in some places. But newspapers kept raising their rates for advertisers, who had to pay more to reach less of the market. The logic was: where else are they going to go? Well, eventually an answer to that question emerged—Google, Facebook—and newspapers discovered how much loyalty they had built up among advertisers.

Facebook has "where else are they going to go?" logic now. And they have good reason for this confidence. (It's called network effects.) But "where else are they going to go?" is a long way from trust and loyalty. It is less a durable business model than a statement of power. 

In my piece for the Washington Post on these events, I distinguished between the "thin" legitimacy that Facebook operates under and the "thick" legitimacy that the university requires to be the institution it was always supposed to be. (Both are distinct from illegitimacy.) News organizations should learn to make this distinction more often. Normal PR exists to muddle it. Which is why you don't hand a research crisis over to university PR people.

I think Facebook too should be thinking about "thick" vs. "thin" legitimacy, but I see no sign of that and do not expect it to happen soon. Some would even argue that a public company in the U.S. cannot afford "thick" legitimacy. I think that's wrong but plenty of corporate lawyers and CEOs would disagree with me.

Here's how we will know when the situation is changing. The first Internet company that succeeds big—like, culturally big—with terms-of-service that are readable, understandable, equitable, and humane will signal a shift. We are not there yet.

Finally, some commentators have questioned the practice of A/B headline testing in the aftermath of this scandal—is there a clear connection?

The connection to me is that both are forms of behaviourism. Behaviourism is a view of human beings in which, as Hannah Arendt said, they are reduced to the level of a conditioned and "behaving" animal—an animal that responds to these stimuli but not those. This is why a popular shorthand for Facebook's study was that users were being treated as lab rats.

One of the things that disturbed me about the episode was a milieu of casual behaviourism among Facebook workers, who don't seem aware that this view of other people is enormously reductive. Also, it has effects on those who hold it.

Journalism is supposed to be about informing people so they can understand the world and take action when necessary. Action and behaviour are not the same thing at all. One is a conscious choice; the other is a conditioned response. There's a tension, then, between commercial behaviourism, which may be deeply functional in some ways for the news industry, and informing people as citizens capable of understanding their world well enough to improve it, which is the deepest purpose of journalism. A/B testing merely highlights this tension.


This article was first published by the World Editors Forum.

(Image via JaysonPhotography / Shutterstock.com)