Beyond Renewables: Emerging Technologies for “Greening” the Data Centre Industry

In the immediate future, liquid cooling could be among the most significant contributors to overall energy savings.

Facebook recently made news with the opening of its data center in Los Lunas, New Mexico, USA, which will operate exclusively on renewable energy.[1] This project joins ventures by companies like Italy’s Aruba.it[2] and Australia’s DC Two[3] in prioritizing alternative energy.

As important as renewables are to long-term sustainability, there are additional environmental considerations. From water use to raw materials inputs, data centers are huge consumers of natural resources. They will need to realize greater efficiencies to minimize their impacts as global data volumes and the demand for intensive processing skyrocket.

Fortunately, there are several emerging technologies that can help slim the industry’s ecological footprint in the coming decades.

#1 Liquid hardware cooling

Liquid cooling could be among the most significant near-term contributors to overall energy savings, and various liquid-cooled hardware is already in production. Lenovo, for instance, announced rear-door heat exchangers capable of cutting PUE (power usage effectiveness, the ratio of total facility energy to IT equipment energy) from the 1.5 to 2 range to about 1.3. And systems using direct-to-chip cooling could bring PUE down to 1.1.[4]
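Because PUE is the ratio of total facility energy to IT equipment energy, the cited improvements translate directly into cooling-and-overhead savings. A rough arithmetic sketch, where the 1,000,000 kWh IT load is an invented example figure:

```python
def pue_overhead_kwh(it_load_kwh: float, pue: float) -> float:
    """Non-IT energy (cooling, power distribution) implied by a PUE value.

    PUE = total facility energy / IT equipment energy,
    so the overhead is simply IT load * (PUE - 1).
    """
    return it_load_kwh * (pue - 1.0)

# Hypothetical facility whose IT equipment draws 1,000,000 kWh per year.
it_load = 1_000_000

before = pue_overhead_kwh(it_load, 1.5)  # conventional air cooling
after = pue_overhead_kwh(it_load, 1.1)   # direct-to-chip liquid cooling
print(f"Overhead before: {before:,.0f} kWh")               # 500,000 kWh
print(f"Overhead after:  {after:,.0f} kWh")                # 100,000 kWh
print(f"Overhead reduced by {(before - after) / before:.0%}")  # 80%
```

Even moving from the low end of the air-cooled range (1.5) to 1.1 cuts non-IT energy by 80 percent in this toy calculation, which is why the liquid-cooling figures above matter so much.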

The time is also ripe for full immersion cooling, dunking sealed components in dielectric fluid. This approach eliminates server fans, reduces CPU power consumption, and takes the pressure off data center air conditioning. Estimates for power savings reach as high as 50 percent.[5]

Immersion cooling can also achieve three times the density of traditional Intel processors,[6] which means delivering more computing power on less land. A final benefit: immersion cooling reduces the effects of heat and humidity, helping to extend component lifespans. This means fewer drive replacements and lower overall materials inputs.

#2 AI-driven data center infrastructure management solutions

Artificial intelligence has a role to play as well, and Google is a leader in this field. The company recently upgraded from a 2014-era machine learning-based “recommendation engine” requiring manual infrastructure adjustments to a proprietary, automated infrastructure management solution. The system evaluates over 120 variables, from fan settings to the dew point, to identify inefficiencies and optimize PUE.[7] The technology is shaving 30 percent off the company’s cooling system energy use, and savings could reach 40 percent.[8]
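Google has not published its model, but the basic loop (predict PUE from sensor variables, then search for the settings that minimize the prediction) can be caricatured in a few lines. Everything below, from the choice of two variables to the coefficients and ranges, is invented purely for illustration:

```python
import itertools

def predicted_pue(fan_speed_pct: float, water_setpoint_c: float) -> float:
    """Toy stand-in for a learned PUE model over two of the many variables
    a real DCIM system would consider. Coefficients are made up."""
    cooling_term = 0.004 * fan_speed_pct                    # faster fans draw more power
    thermal_term = 0.02 * max(0.0, 22 - water_setpoint_c)   # colder chilled water costs energy
    return 1.1 + cooling_term + thermal_term

# Exhaustively score candidate settings and recommend the lowest predicted
# PUE, mimicking (in miniature) an automated recommendation loop.
candidates = itertools.product(range(30, 101, 10),  # fan speed, percent
                               range(16, 23))       # water setpoint, degrees C
best = min(candidates, key=lambda c: predicted_pue(*c))
print("Recommended (fan %, water °C):", best)
```

A production system would replace the hand-written formula with a model trained on site telemetry and would respect safety constraints before acting, but the predict-then-optimize shape is the same.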

#3 Increased server utilization

Speaking of AI, it has a lot to offer for server utilization. Data center operators recognize that zombie servers are a problem, and one that’s not going away with virtualization alone. While some 25% of physical servers may be comatose, an estimated 30% of virtual servers have shown no activity for at least six months.[9]

Arriving to remedy this problem are AI-based load balancers that can detect ghost servers and distribute workloads across available hardware based on server performance, disk utilization, network congestion, and the energy efficiency of each piece of equipment.
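As a minimal sketch of the detection half of that job, the snippet below flags servers whose last observed activity is six months or more in the past. The inventory, server names, and 180-day threshold are all hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical inventory: (server_id, timestamp of last observed activity).
inventory = [
    ("vm-001", datetime(2019, 1, 4)),
    ("vm-002", datetime(2018, 5, 30)),
    ("db-001", datetime(2019, 2, 1)),
]

def comatose(servers, now, idle_threshold=timedelta(days=180)):
    """Return IDs of servers with no observed activity for six months or more."""
    return [sid for sid, last_seen in servers if now - last_seen >= idle_threshold]

print(comatose(inventory, now=datetime(2019, 2, 15)))  # ['vm-002']
```

A real load balancer would combine this kind of idleness signal with the performance, disk, network, and energy telemetry described above before redistributing workloads or retiring hardware.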

IDC predicts that 50 percent of IT assets will have autonomous operation capabilities by 2022.[10] Turning server utilization over to the machines will help overcome the human inertia that sets in when manual workload retooling is expected to yield only incremental efficiency gains.

#4 Quantum computing

Further into the future is the promise of quantum computing. A study by Oak Ridge National Laboratory, a U.S. Department of Energy facility, found that quantum computers could reduce energy needs by more than 20 orders of magnitude over their conventional counterparts.[11]

The prospect has a lot of money flowing in. The government of British Columbia, Canada, for example, has spent millions with D-Wave, backing its promises to address climate change.[12] Venture capital investment has reached $250 million per year and some governments—the EU, China, and the USA—are sinking billions into R&D.[13]

Some experts worry that the current status of quantum computing is overhyped. True, today’s quantum computers remain prone to errors that are difficult to correct. The need for an entirely different programming approach is another core barrier.

Nonetheless, IBM has unveiled a commercial quantum computer, the Q System One.[14] IBM is also among the companies making their quantum computers and accompanying programming kits publicly available online.

Unfortunately, these systems are of little practical value yet, and quantum computers won’t simply replace laptops anytime soon.[15] For near-term applications, quantum computers will most likely serve as “accelerators” within major providers’ clouds. The goal would be to identify when workloads can benefit from a quantum system’s computational fortes and tap those resources on a targeted basis.[16]

The true breakout of quantum computing is likely more than a decade off,[17] but we still look forward to its potential once the field has matured.

#5 Increasing dominance of hyperscale providers

Hyperscale providers are well positioned to adopt these kinds of efficiency technologies, and the outsize financial impact of even marginal savings gives them the incentive to do so.

It’s hard to imagine interests with less of a budget than Microsoft, for example, exploring how to site data centers under the ocean to benefit from the naturally cool temperatures.[18] Similarly, Google’s AI-based DCIM relies on deep learning models trained on data from each specific site, so an equivalent commercial system capable of rolling out to more diverse data centers is still many steps away.

There are certainly reasons to worry about market consolidation, but as an increasing share of data storage, processing and network traffic moves to hyperscale providers—with their share expected to reach 57%, 69% and 53% respectively by 2020[19]—the world is shifting toward those companies that can best leverage emerging technologies to enhance sustainability.