Chances are you’ve heard the song “Despacito,” by Puerto Rican artists Luis Fonsi and Daddy Yankee. Its distinction as the most-watched YouTube video of all time suggests it was unavoidable when the song was released in 2017.
Nearly a year ago, when the video became the first clip to pass 5 billion views on YouTube — it’s since reached 6.6 billion — Fortune magazine published a more alarming statistic: Streaming the nearly five-minute video that many times required as much computing power “as 40,000 U.S. homes use in a year.”
As more people ramp up their online activity — streaming Netflix shows like “Love Is Blind,” shopping on Amazon, gaming, banking, etc. — the data centers needed to make that happen are multiplying worldwide. And all that computational activity takes energy. These cavernous buildings need significant amounts of electricity to operate and cool their humming servers, the central data repositories for individual devices on a network.
Despite the eye-popping number associated with all those “Despacito” views, researchers say our soaring internet use hasn’t yet driven an equally huge boom in electricity use. That’s mainly thanks to improvements in energy efficiency.
Globally, demand for data center services rose by 550 percent between 2010 and 2018, a new study found. But the facilities’ energy use grew by only 6 percent in that same period. In the United States, the world’s biggest data center market, energy use actually plateaued over that stretch — a sharp departure from the early 2000s, when a doubling of data center output meant a doubling in energy demand.
“There’s been a drastic decoupling in the amount of data center services provided and the energy use,” said Sarah Smith, who coauthored a recent paper in the journal Science. Smith is a senior scientific engineering associate at the Lawrence Berkeley National Laboratory in California.
There are a number of efficiency-related reasons for this decoupling: Companies are increasingly moving servers out of office buildings and into large, shared facilities, which allow for more efficient use of cooling and ventilation systems. Meanwhile, facilities in colder countries like Finland and Sweden use naturally chilled air and water to keep servers from overheating. Servers themselves are improving, with the latest models using far less energy than their power-hungry predecessors.
Smith and her colleagues’ paper suggests that earlier reports have exaggerated the environmental effects of all our binge-watching and cloud computing. While some experts dispute that conclusion, they agree that energy efficiency has made huge dents in data centers’ electricity appetite. But everyone who spoke to Grist was united in warning that such measures will only hold back this growing hunger for so long.
Curbing energy use
Inside each data center, thousands of pizza-box-shaped servers are stacked in rows upon rows of racks. An individual server can do the work of about 10 computers, in terms of storing, moving, and analyzing data. Engineers are continuously tweaking server designs so the machines consume less electricity to process data. Since 2010, the amount of electricity used per computation has dropped by a factor of four. And whereas older servers draw the same amount of electricity whether they’re active or sitting unused, newer models use only about a third as much energy when idle.
Data-center operators are likewise installing more energy-efficient equipment to cool and circulate air and to illuminate long, narrow corridors. That’s leading to steady reductions in the “power usage effectiveness” of data centers. That measure compares a data center’s total energy use to the amount of energy needed to run the computing equipment itself. A power usage effectiveness of 2, as Smith explained, means that for every watt used on computing, another is required to do the cooling and everything else in the building. Today’s highly efficient data centers have an index of about 1.1. By contrast, a server room in a typical office building might have an effectiveness of 3 or 4, which is one reason why many businesses are opting for shared — so-called “hyperscale” — facilities.
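For readers who want the arithmetic spelled out, the ratio Smith describes can be sketched in a few lines of code. The wattage figures below are illustrative assumptions chosen to match the article’s examples, not measurements from any real facility:

```python
def power_usage_effectiveness(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / power used by the computing equipment itself.

    A value of 1.0 would mean every watt goes to computing; higher values
    mean more overhead for cooling, lighting, and everything else.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical loads (illustrative numbers only):
# a typical office server room, and a highly efficient hyperscale facility.
office_server_room = power_usage_effectiveness(total_facility_kw=300, it_equipment_kw=100)
hyperscale_center = power_usage_effectiveness(total_facility_kw=110, it_equipment_kw=100)

print(office_server_room)  # 3.0 -> two extra watts of overhead per watt of computing
print(hyperscale_center)   # 1.1 -> just a tenth of a watt of overhead per watt of computing
```

Run with these made-up inputs, the office server room lands at a PUE of 3, while the hyperscale facility lands near 1.1 — the gap that’s driving businesses toward shared data centers.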
And it’s not just data centers. Energy efficiency is helping reduce the environmental footprint of other global industries. On land, architects and construction firms are designing buildings to harness more daylight and reduce artificial lighting, as well as using sustainable insulation materials and thicker-paned windows to shrink heating and cooling needs. In 2018, about 250 global architectural firms said they expected to slash the predicted energy use of their new buildings by nearly half, compared to a 2003 baseline. That avoided energy consumption could help prevent 17.7 million metric tons of carbon dioxide emissions — equivalent to taking 4 million passenger cars off the road — and save more than $3.3 billion in operational costs, the American Institute of Architects recently reported.
Meanwhile, at sea, cargo shipping companies are designing vessels to guzzle less fuel. Many newer ships can plug into shoreside electricity supplies to avoid running their massive diesel engines at berth. CMA CGM of France is equipping ships with electronically controlled engines to optimize performance, while China’s COSCO Shipping is outfitting vessels with new propellers and hulls to reduce wave resistance. As a result, according to the Clean Cargo Working Group, emissions from container ships dropped by 37 percent on average (per container, per mile) from 2009 to 2017.
Energy efficiency improvements can make a meaningful difference both for a company’s bottom line and the environment, but they can only stretch so far before hitting physical limits.
According to energy experts, efficiency measures should be able to absorb the next doubling of global data center output and keep electricity use steady over the next four years. Beyond that, it will be harder to stem a surge in power demand without significant changes to computing technology.
“Once almost all computing loads shift to hyperscale facilities, then the benefits of shifting away from really inefficient corporate data centers just run out, and you’ll have to do some other things,” said Jonathan Koomey, a coauthor of the Science paper and a longtime data center researcher.
One way to further limit a facility’s environmental toll is to connect it to renewable energy sources. Many data centers still rely on energy generated by coal- and natural gas-fired power plants. In China, coal supplies about three-fourths of the electricity that the country’s cloud operations consume, according to a study by Greenpeace and Chinese academic institutions. In Virginia, major U.S. tech companies — like Amazon, which is setting up its HQ2 in the state’s Washington, D.C., suburbs — are expanding their presence without adding new supplies of wind or solar power. Amazon’s Virginia-based data centers are powered by only 12 percent renewable energy, a separate Greenpeace report said.
“We need to make sure we’re building this digital infrastructure in a way that’s not making the climate change problem worse and taking us in the wrong direction,” said Gary Cook, a former Greenpeace researcher who is now a director for the environmental corporate responsibility watchdog Stand.earth.
Cook said the Science study likely underestimated the amount of electricity that today’s data centers use. He pointed to other reports that found that facilities in the U.S., European Union, and China together consumed around 400 terawatt-hours of electricity annually, nearly 2 percent of global electricity consumption. In the Science paper, researchers said that global data center energy use was about half that amount, or 205 terawatt-hours.
Some of the discrepancy can be explained by researchers’ different approaches to modeling data center activity. Gathering real numbers from actual cloud setups is notoriously difficult, in part because tech companies aren’t willing to share the information. The new report also doesn’t factor in specialized computers used for “mining” cryptocurrencies like Bitcoin or Ethereum, whose carbon footprint has been hotly debated, and it doesn’t consider artificial intelligence or virtual reality applications, which are also computationally intensive, said Lotfi Belkhir, a mechanical engineering professor at McMaster University in Canada.
He said he expects existing efficiency measures will become “tapped out” sooner than other researchers implied. Unless computing technology takes a quantum leap very soon, Belkhir said, “We’re bound for a major uptick in data center energy consumption that will continue to grow exponentially.”
But Koomey and Smith argue that data centers can still do more to boost efficiency before the rising energy demand becomes too much. Scientists are developing “liquid cooling” technologies that place computer chips in direct contact with water or another liquid. By some accounts, this approach could reduce cooling costs by at least 80 percent compared to conventional whirring fans. State and federal governments can adopt energy-efficiency standards or renewable energy requirements to ensure that data centers use only the most advanced technology and cleanest power available. Requiring data centers to report information, even anonymously, would also give researchers more tools for designing and improving their operations.
“We need to make an effort to be prepared for more demand in the future,” Smith explained. “That being said, we can look at what has happened in the industry in the last decade and see it as an inspiration.”
And that’s worth celebrating with a little reggaeton, at least for now.