The Environmental Toll of a Netflix Binge


The data centers that support the Internet use a huge amount of energy.

My favorite part of highways in the American Midwest is the wind turbines. Not so much the massive wind turbines on the horizon, begging for a contemporary Don Quixote to come at them, but the bits and pieces of wind turbines—single fan blades and sections of foundation—that pass by on long-haul trucks. They’re like an exploded diagram on wheels, a useful reminder of the sheer scale and complexity entailed in building energy systems.

Data-center operations managers love to talk about energy systems—or, more specifically, efficiency in their use of energy systems. The fact that companies increasingly foreground this sustainability information when engaging with journalists demonstrates a growing public interest in The Cloud’s environmental impact. But really articulating and measuring that impact is difficult. It basically requires taking apart the constituent parts of the data center itself.

Since around 2006, power-usage effectiveness (PUE) has been the metric of choice for the data-center industry. Created by the Green Grid Association, a consortium of IT-industry professionals dedicated to “accountable, effective, resource-efficient, end-to-end ICT ecosystems,” PUE is a fairly simple ratio of a data center's total power usage to the power usage of its actual IT equipment. It's more a metric of efficiency than of environmental impact.
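For readers who want the ratio spelled out, here is a minimal sketch of the PUE calculation as described above; the wattage figures are hypothetical, chosen only to illustrate the arithmetic.

```python
# Minimal sketch of the PUE ratio described above.
# The kilowatt figures below are hypothetical, for illustration only.

def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt entering the building reaches
    the servers; real facilities run higher because of cooling,
    lighting, and power-distribution losses.
    """
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,500 kW in total to run 1,000 kW of IT gear:
print(power_usage_effectiveness(1500, 1000))  # -> 1.5
```

Note that the ratio says nothing about where that power comes from, only how much of it reaches the servers.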

When I asked Nathan Ensmenger (an informatics professor at Indiana University currently working on a book on the environmental history of computing) about PUE, he offered a pretty blunt assessment. “PUE is [an] atrocious metric. It ignores everything that’s interesting and problematic—basically, if you turn the lights off, you're operating at a hundred percent efficiency.”

PUE only looks at the internal operations of the data center—not, say, where its energy comes from, or how much energy it’s using and what that looks like proportionately. Admittedly, accessing some of those metrics is tricky—while Google is happy to report that its data centers account for a scant 0.01 percent of global electricity use, what does that mean in actual kilowatt-hours?

Here’s where things get a bit back-of-the-napkin: According to the U.S. Energy Information Administration, global electricity consumption in 2012 was 19,710 billion kilowatt-hours. Take Google’s 0.01 percent estimate at face value and that works out to roughly 2 billion kilowatt-hours a year, on the order of what a small nation draws according to electricity-consumption data in the CIA World Factbook. That figure actually squares with Google’s 2011 disclosure that it used 260 million watts of power: sustained over a full year, 260 megawatts comes to about 2.3 billion kilowatt-hours, a draw noted at the time for being slightly more than the entire electricity consumption of Salt Lake City. In its 2013 sustainability report, Facebook stated that its data centers used 986 million kilowatt-hours of electricity—around the same amount consumed by Burkina Faso in 2012.
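To make the napkin math explicit, here is a quick back-of-the-envelope sketch using only the figures cited above; it assumes the 260 million watts disclosed in 2011 was a continuous draw, which is how the Salt Lake City comparison was framed.

```python
# Back-of-the-envelope check of the figures cited above.
# All inputs come from numbers quoted in this piece; nothing here is a measurement.

global_consumption_bkwh = 19_710      # billion kWh worldwide in 2012, per the EIA
google_share = 0.0001                 # Google's stated 0.01 percent

google_bkwh = global_consumption_bkwh * google_share
print(f"Google, per its 0.01 percent estimate: ~{google_bkwh:.1f} billion kWh/year")

# Cross-check against the 2011 disclosure of 260 million watts,
# assumed here to be a continuous draw over a full year.
hours_per_year = 8_760
google_2011_bkwh = 260e6 * hours_per_year / 1e3 / 1e9   # watts -> kWh -> billion kWh
print(f"260 MW running year-round: ~{google_2011_bkwh:.1f} billion kWh/year")

facebook_bkwh = 986e6 / 1e9           # Facebook's 986 million kWh, from its 2013 report
print(f"Facebook's data centers: ~{facebook_bkwh:.2f} billion kWh/year")
```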

Those are some of the metrics used by Greenpeace in its series of reports and campaigns around the environmental impact of data centers. The organization has had a pretty significant role in changing the discourse around this issue since it began campaigning on it in 2010.

“When we first started highlighting [Facebook's] data centers running on 50 to 60 percent coal they were saying, ‘Oh that's not fair to say we’re powered by coal, we just buy electricity off the grid like everyone else. But we're really really efficient, why don't you focus on how efficient we are instead of talking about this coal stuff?’” said Gary Cook, a senior IT analyst at Greenpeace. But the company “eventually came around to agreeing that they have a responsibility and an opportunity to drive growth with renewable energy and prioritize access to renewables as they continue to grow.” In 2011, Facebook became one of the first major tech companies to commit to operating on entirely renewable energy, in no small part because of Greenpeace’s campaigning.

Of course, the decision to use renewable energy is as much a business calculation as it is a public-relations one. Tech companies aren’t underwriting wind turbines for their data centers or embracing renewable energy solely out of some altruistic concern for the environment or fear of bad press. If that were the case, they'd be using their billions in profit to build wind turbines to power the grids of entire cities, not just canceling out their own environmental impact.

The price of renewable power is far less volatile than that of power derived from commodities like coal, which is useful for things like long-term budgeting and cost-cutting. Remember, data centers aren't so much centers of technical operations as of logistics—that is, the manufacture and movement of commodities.

“Often people think of [data centers] as almost like cathedrals of servers. Very clean computer equipment, white walls and things—the reality is, these are factories,” said Tate Cantrell, CTO of data-center company Verne Global. Cantrell’s company runs a data center in Iceland, a location with incredible renewable-power resources and existing industrial infrastructure. The “industrial” part is key here.

“The aluminum smelters are what really focused our attention,” Cantrell explained. Aluminum is a commodity whose price is largely defined by the price and reliability of electricity, which smelters consume in enormous quantities to turn alumina into metal. Aluminum is a growing industry in Iceland (as of last year it was the 11th-largest producer), and its smelters connect to an extremely reliable grid whose power comes from geothermal and hydroelectric sources. Verne Global is connected to that same grid, which provides the company with relatively long-term price predictability.

Of course, moving all the data-center companies to Iceland wouldn’t address the systemic problems that keep many data centers running on coal power. In the United States, that problem lies mainly in the hands of regional power companies.

“[Tech companies] are some of the biggest new areas of growth for utility companies in terms of energy,” Cook noted, which gives them a lot of leverage when they demand renewable power. And they’ve sometimes used that leverage to push through major changes. In North Carolina, Google pressured Duke Energy to create a new renewable-energy program. In Iowa, MidAmerican Energy’s decision to cancel a nuclear-power project conveniently coincided with Facebook’s investment in wind power in the state.

But that’s leverage available to companies operating at the scale of Facebook and Google; it’s not really something smaller colocation services can pull off. Relative to the entire industry—data centers run by university campuses, enterprise colocation providers, hospitals, government agencies, banks—companies like Facebook and Google are a pronounced but still minor piece of the landscape. Some smaller companies have been able to push for changes, but they tend to need one of the heavy hitters to act as muscle first.

Switch, a Las Vegas-based data-center company, was able to pressure the state utility company to provide renewable power to its data center—but only after Apple paved the way by demanding it first. In most cases, the companies that can convince utility providers to use more wind power are the ones that can effectively underwrite the cost of building more wind turbines.

And while underwriting the cost of a wind farm is laudable, it only addresses one kind of environmental impact, focused on one particular set of metrics. It doesn't address things like water usage, pollution from backup generators (which Microsoft, in a now-infamous 2012 incident, ran at full tilt at its Quincy, Washington, data center), supply chains for the rare-earth minerals used in hardware, and the toxic materials involved in producing that hardware.

“The tech industry is really good at erasing its connections to these more mundane concerns like energy and electricity and maintenance,” Ensmenger noted, pointing out that Silicon Valley alone has 23 Superfund sites as a result of hardware production.

The impact of data centers—really, of computation in general—isn’t something that galvanizes the public, partly because that impact typically happens at a remove from everyday life. The power it takes to charge a phone or a laptop is negligible, but streaming a video or using an app on either device calls on data centers distributed across the globe, each of which draws energy to do work that then travels across the network to the device. One study (sponsored, weirdly enough, by the American Coal Association, with the purpose of enthusing about how great coal is for technology) estimated that a smartphone streaming an hour of video a week uses more power annually than a new refrigerator.

Both Cook and Cantrell argued that if the energy powering a computational process is renewable, how much of it that process consumes isn’t that big of a deal. Shaming consumers for their Netflix binges doesn’t exactly mobilize a base, and at the end of the day companies have far more agency to make technology choices that could lessen their environmental impact. Still, it seems weird that most people—even most engineers building the platforms people use every day—don’t grasp that different online activities have different energy impacts, or that an individual’s online activities have any energy impact at all beyond a laptop’s battery life.

Which isn’t to say there isn’t interest. Ensmenger’s research was largely spurred by introducing this topic to undergraduate students in an information-ethics course. While students weren’t particularly enthusiastic about issues of privacy and surveillance, they were extremely interested in the environmental issues, in part because they had local concerns—Bloomington, Indiana, is home to three Superfund sites related to hardware production.

“Without question, it’s the most popular and engaging part of some of the courses that I teach,” Ensmenger said.

Contrary to a million thinkpiece ledes bemoaning a public that doesn't understand or care about the physical realities of The Cloud (some of which I am admittedly guilty of writing), there is an appetite for this material—but not, as Ensmenger put it to me, very much by way of nutritious food. If we want better, more sustainable technologies to be part of the future, these questions need to become part of the curricula taught to the people who will build them.

Ultimately, all of these varied metrics and environmental concerns introduce questions about time, maintenance, and scale that the tech industry doesn’t often face. The Cloud optimizes for real time, not geologic time. Much of today’s network infrastructure was built atop the infrastructures of the 19th century, and that infrastructure wasn’t necessarily built to last. It’s amazing and wonderful and ridiculous that it has lasted, but it can’t hold up much longer under the strain of exponentially expanding connectivity demands unless these long-term questions are taken up first.

At one point in our interview, Gary Cook commented that “the Internet is the single biggest thing we’re going to build as a species. This is something [that], if we build it the right way, with the right sources of energy, could really help power our transition to renewables. If we build it the wrong way, it could actually exacerbate the problem.”

While it might be a little hyperbolic to declare the Internet as civilization’s greatest achievement (I’m personally a fan of vaccinations), there’s no shortage of VCs and CEOs who believe and evangelize that position. But embracing such a grandiose statement about the network also means taking a radical responsibility for it and recognizing that how we build the future frankly determines whether we have a future at all.
