Driving Green Data Centers Beyond Renewables

Data centers are undeniably power hungry, consuming over 2 percent of the world’s electricity and producing CO2 emissions in quantities rivaling those of the airline industry.[1] Driven by exploding data volumes, the proliferation of consumer electronics, IoT and edge deployments, and demand for high-end analytics and machine learning, the information and communications technology (ICT) sector could use 20% of global electricity and generate 5.5% of all carbon emissions by 2025, according to a 2016 study that has since been updated.[2]

Thus the hunt is on for ways to trim the industry’s ecological footprint. Renewable energies are gaining adherents and headlines, but are they enough to transform data centers into sustainable operations?

Renewables Are Only Part of the Solution

Adoption of renewables is on an upswing. For example, Facebook’s widely covered Los Lunas data center in New Mexico, USA, expected to be completed in 2023, will tap an 80MW solar installation to power its nearly 3 million square feet.[3] And already, companies like Italy’s Aruba.it are combining photovoltaic and hydroelectric power to meet 100% renewable aspirations,[4] while Iceland has become a hotbed of data center construction due to its geothermal-driven energy options.[5][6]

Despite the attention, renewables fuel only about 20% of data center energy needs.[7] Even with greater uptake, they won’t be enough to achieve true sustainability.[8] Questions remain about how quickly a complete transition to non-polluting power is possible, and renewables carry environmental costs of their own; wind turbines, for example, depend on the mining of neodymium.

Data centers are also large consumers of the world’s freshwater, this at a time when 2.7 billion people suffer from water scarcity.[9] And physical hardware, such as servers, relies on raw materials, from rare earth metals to polyvinyl chlorides. While good stewardship will help protect soil, water, air, and our food chain from contamination, constant expansion in demand for equipment will exacerbate data centers’ impacts.

Long-term, data centers must think bigger and bolder to identify, integrate, and optimize technologies to enable the delivery of far greater compute power, storage capacity, and network connectivity with fewer inputs of energy, water, and other resources. Where do our best hopes lie?

A Quantum of Solace?

Google burns over 8 gigawatt-hours of energy per year[10] to provide its search, maps, digital assistant, fitness tracking, and other products. It’s an amount equivalent to the generation capacity of over 25 million photovoltaic panels or 3,500 utility-scale wind turbines.[11]

Now imagine slashing that energy use by 20 orders of magnitude, down to a fraction of a nanowatt-hour.

That’s what’s possible with quantum computing, according to a simulation by Oak Ridge National Laboratory, a U.S. Department of Energy facility. All told, the researchers found quantum technologies could deliver savings of over one million kilowatt-hours.[12]
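
For readers who want to sanity-check that scale, the arithmetic is straightforward. The short snippet below uses only the 8 GWh figure and the 20-orders-of-magnitude reduction quoted above; it is a back-of-the-envelope illustration, not part of the Oak Ridge study.

```python
# Back-of-the-envelope check of the "20 orders of magnitude" claim above.
annual_energy_gwh = 8                        # Google's cited annual energy use, in gigawatt-hours
annual_energy_wh = annual_energy_gwh * 1e9   # 1 GWh = 1e9 Wh

reduced_wh = annual_energy_wh / 1e20         # slash consumption by 20 orders of magnitude
reduced_nwh = reduced_wh * 1e9               # 1 Wh = 1e9 nWh

print(f"{reduced_nwh:.2f} nanowatt-hours")   # -> 0.08 nanowatt-hours, i.e. a fraction of one
```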

Such a radical reduction on the compute side of the data center would slash the power-generation capacity required. Additionally, D-Wave, a player in the quantum space, highlights that quantum computers run cold and can deliver HVAC-related savings as well.[13]

The great potential of quantum computing, admittedly as much for its cybersecurity and defense implications as for its environmental ones, has governments such as the EU, China, and the US pouring billions into research and development each year.[14] Venture capital investment is ramping up as well.

Unfortunately, a true breakout is likely more than a decade off, and the hurdles remain high.[15] Simple vibrations can scramble quantum calculations, and error correction is far from straightforward. An entirely different programming approach is also required. Much-hyped releases, such as IBM’s Q System One,[16] may have current applications in scientific research and high-end modeling, but they won’t replace everyday laptops anytime soon.[17]

This means that in the immediate future, quantum computers are most likely to take their place in the cloud, where early-stage kits have already been made available. With further development, quantum computers could soon serve as “accelerators.” In such a scenario, cloud providers would identify workloads befitting quantum’s fortes and tap those resources accordingly.[18] This could enable far more detailed weather forecasting, for instance, but it won’t take the load off data center resource inputs in the near term.
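
What that accelerator pattern could look like is sketched below, purely as a hypothetical illustration: the job categories, the QUANTUM_AMENABLE set, and the backend names are invented for the example and do not reflect any provider’s actual API.

```python
# Hypothetical sketch of a cloud scheduler treating quantum hardware as an accelerator.
# Categories and backend names are illustrative only; no real provider API is implied.

QUANTUM_AMENABLE = {
    "combinatorial_optimization",   # e.g. routing and scheduling problems
    "molecular_simulation",         # quantum chemistry workloads
    "quantum_machine_learning",     # experimental QML workloads
}

def route_workload(job_category: str) -> str:
    """Send jobs that play to quantum's strengths to a quantum backend,
    and everything else to conventional servers."""
    if job_category in QUANTUM_AMENABLE:
        return "quantum-accelerator-pool"
    return "classical-server-pool"

print(route_workload("molecular_simulation"))  # -> quantum-accelerator-pool
print(route_workload("web_serving"))           # -> classical-server-pool
```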

Must We Go Hyperscale?

Hyperscale providers may be best positioned to make the leap to cutting-edge efficiency technologies, be they quantum computing or other approaches currently being trialed. After all, it would be difficult for the average regional colocation provider to support the research budget to site prototype data centers on the seafloor, as Microsoft is doing.[19]

There are other technologies that, for now, seem custom-made for hyperscale as well. For example, Google has been integrating artificial intelligence into data center infrastructure management (DCIM). The company has deployed an automated, machine-learning-based system that tracks over 120 data center variables, along with current weather conditions, to remedy inefficiencies and lower power usage effectiveness (PUE).[20] The system currently shaves about 30 percent off cooling system energy use and could soon reach 40 percent savings.[21]
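
Google’s production system relies on deep neural networks trained over its full telemetry stream, but the underlying pattern can be sketched simply: learn a model that predicts PUE from facility conditions, then search candidate cooling settings for the one with the lowest predicted PUE. The features, the synthetic data, and the linear surrogate model below are illustrative assumptions only.

```python
# Minimal, illustrative sketch of ML-assisted cooling optimization (not Google's actual system).
# Idea: fit a model mapping facility telemetry -> PUE, then pick the best predicted setpoint.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic history: [IT load (kW), outside air temp (C), chilled-water setpoint (C)] -> observed PUE
X = rng.uniform([500, 5, 7], [1500, 35, 18], size=(200, 3))
pue = 1.1 + 0.0001 * X[:, 0] + 0.004 * X[:, 1] - 0.006 * X[:, 2] + rng.normal(0, 0.01, 200)

# Fit a simple linear surrogate (real systems use neural networks over ~120 variables).
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, pue, rcond=None)

def predict_pue(it_load_kw, outside_temp_c, setpoint_c):
    return coef @ np.array([1.0, it_load_kw, outside_temp_c, setpoint_c])

# Given current conditions, scan candidate setpoints and recommend the most efficient one.
candidates = np.arange(7.0, 18.5, 0.5)
best = min(candidates, key=lambda s: predict_pue(1000, 28, s))
print(f"Recommended chilled-water setpoint: {best:.1f} C")
```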

The problem is that Google’s DCIM system cannot be deployed even across its own highly uniform facilities without significant customization and lengthy training time for the machine-learning models. An equivalently robust system that could be rolled out to the diverse array of facilities comprising the data center industry is still some ways off.

This might indicate an upside to today’s consolidation trends, which, according to Cisco, will place 57% of data storage, 69% of data processing, and 52% of network traffic in the hands of a few hyperscale providers by 2020.[22] By shifting its workloads, the world will be shifting its investment to the data centers perhaps most capable of leveraging advanced technologies to increase sustainability.

Options for the Rest of Us

The problem with a hyperscale-dominated solution is that it doesn’t fully reflect the status of the industry today, with its hybrid infrastructure and continued on-premises operations. And it could run headlong into edge computing. The data center industry is currently investing in second- and third-tier cities, intending to use regional facilities to provide lower latency appropriate for many IoT applications.[23] From there, it’s far from clear how the move to the edge will shake out.

Fortunately, there are technologies likely to deliver substantial impact for the average enterprise data center, colocation provider, or even regional office. AI, for example, needn’t be written off as a hyperscale-only tool. Smaller-scale providers may soon use AI-based technologies to kill off their zombie servers: the 25% of physical servers and 30% of virtual servers that have demonstrated no activity for at least six months.[24]
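
As a simplified illustration of what such a tool automates, the sketch below flags servers that show no meaningful activity across a six-month window. The utilization threshold, sample format, and server names are assumptions for the example; production tools would draw on much richer telemetry.

```python
# Illustrative zombie-server detection: flag machines with no meaningful activity
# over the last six months. Thresholds and data format are assumptions for this sketch.
from datetime import datetime, timedelta

ACTIVITY_THRESHOLD = 0.05          # below 5% peak CPU counts as "no activity" here
WINDOW = timedelta(days=182)       # roughly six months

# Each record: (server_id, timestamp of sample, CPU utilization 0.0-1.0)
samples = [
    ("web-01", datetime(2019, 3, 2), 0.41),
    ("db-07",  datetime(2019, 1, 15), 0.02),
    ("db-07",  datetime(2019, 4, 20), 0.01),
    ("old-batch-03", datetime(2018, 11, 5), 0.03),
]

def find_zombies(samples, now):
    cutoff = now - WINDOW
    seen, recently_active = set(), set()
    for server, ts, cpu in samples:
        seen.add(server)
        if ts >= cutoff and cpu >= ACTIVITY_THRESHOLD:
            recently_active.add(server)
    return sorted(seen - recently_active)

print(find_zombies(samples, now=datetime(2019, 6, 1)))  # -> ['db-07', 'old-batch-03']
```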

Advanced load balancers are also showing promise for detecting such idle machines and for distributing workloads across available hardware based on factors such as server performance, disk utilization, network congestion, and the energy efficiency of each piece of equipment. In fact, IDC predicts that 50 percent of IT assets will have autonomous operation capabilities by 2022.[25]
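
A drastically simplified version of that scoring logic might look like the following; the weights, the metrics, and the normalization are illustrative assumptions, and a real load balancer would also weigh latency, affinity, and failure domains.

```python
# Illustrative energy-aware placement scoring (weights and metrics are assumptions).
# Lower score = better candidate for the next workload.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    cpu_load: float           # 0.0-1.0, current utilization
    disk_util: float          # 0.0-1.0
    net_congestion: float     # 0.0-1.0
    watts_per_request: float  # measured energy efficiency, lower is better

WEIGHTS = {"cpu": 0.4, "disk": 0.2, "net": 0.2, "energy": 0.2}

def placement_score(s: Server) -> float:
    # Normalize energy to roughly 0-1 by assuming ~5 W/request is the worst case.
    energy_term = min(s.watts_per_request / 5.0, 1.0)
    return (WEIGHTS["cpu"] * s.cpu_load
            + WEIGHTS["disk"] * s.disk_util
            + WEIGHTS["net"] * s.net_congestion
            + WEIGHTS["energy"] * energy_term)

fleet = [
    Server("rack1-a", 0.70, 0.40, 0.30, 2.1),
    Server("rack2-b", 0.35, 0.20, 0.25, 1.4),
    Server("rack3-c", 0.20, 0.60, 0.50, 4.8),
]

target = min(fleet, key=placement_score)
print(f"Route next workload to {target.name}")  # -> rack2-b in this example
```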

Another interesting technology is liquid cooling, first used in the mainframes of yore. Although many companies will lack the funds to reengineer entire facilities along the lines Facebook has drawn up, numerous hardware manufacturers are offering relatively easy-to-integrate, liquid-cooled products.

Lenovo has announced rear-door heat exchangers capable of reducing PUE from as much as 2 to about 1.3, and hardware with direct-to-chip cooling could bring PUE down as far as 1.1.[26] Full immersion cooling is also of interest. It eliminates the need for server fans and reduces CPU power consumption, thereby taking the pressure off data center HVAC systems. Estimates for power savings reach as high as 50 percent.[27]
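
To see what those PUE numbers mean for total consumption, recall that PUE is total facility power divided by IT power, so a quick calculation shows the overhead savings. The 1 MW IT load is an assumed example; only the PUE values come from the figures above.

```python
# What PUE improvements mean in practice: total facility power = IT power * PUE.
# The 1 MW IT load is an assumed example; the PUE values come from the figures above.

it_load_kw = 1000  # assumed IT load of 1 MW

for label, pue in [("air-cooled", 2.0), ("rear-door heat exchanger", 1.3), ("direct-to-chip", 1.1)]:
    total_kw = it_load_kw * pue
    overhead_kw = total_kw - it_load_kw
    print(f"{label:25s} PUE {pue:.1f}: {total_kw:.0f} kW total, {overhead_kw:.0f} kW cooling/overhead")

# Going from PUE 2.0 to 1.1 cuts non-IT overhead from 1000 kW to 100 kW, a 90% reduction in overhead.
```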

With increasing density being driven by data analytics, machine learning, and other high-demand technologies, a transition back to liquid cooling seems inevitable, at least for certain applications. Given liquid’s advantages over an air-cooled paradigm, will it be enough to tip the balance toward sustainability?

Sadly, no. The fact is, the ICT sector is awash in compelling projects and high-impact emerging technologies, but none alone can solve the big problem of putting ever-more technology in the hands of consumers and businesses while shrinking resource consumption and carbon emissions.

While the planet waits for a moonshot, whether of a quantum or another variety, we’ll have to make do with the incremental improvements that come from combining the multiple solutions available today, mitigating our footprint while keeping pace with demand.