The Future of the Data Center, Written in the Stars

Seeing the stars may soon become more difficult. That’s the warning from some astronomers, who claim the solar panel reflections from hundreds of SpaceX Starlink satellites, intended to supply global internet, could fundamentally change the view of the night sky.[1] It’s just one connection between what’s happening overhead and life here on earth. From black hole imagery to orbiting supercomputers, space-focused technology has a lot to say about the data center.

Computing at the Speed of Airplanes

The first-ever picture of a black hole, released in April 2019, stunned the world. Among technologists, some pondered the feat of tapping eight terrestrial observation stations to achieve the resolution required to see 55 million light-years away, into the Messier 87 galaxy. Others were captivated by the more relatable photograph of computer scientist Katherine Bouman posing with dozens of hard drives holding the data that made the iconic image possible.[2]

It turns out, the geographically disparate observatories sent their data to the central compilation center in an unexpected way: via FedEx. It's a fact that flies in the face of IT pros' favored mental imagery, Tron-like beams of light-speed transmission across mobile spectrum and fiber-optic cable.

The researchers' reasons, however, were practical. Over a standard internet connection, sending 5 petabytes of data (approximately 5,000 years' worth of MP3 play time[3]) would take years. Shipping disks, by contrast, moved the dataset in a matter of days, an effective throughput of about 14 gigabytes per second.[4]
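
The back-of-envelope math makes the tradeoff plain. Here is a minimal sketch in Python using the article's ~5 PB dataset size and ~14 GB/s shipping figure; the dedicated 1 Gbps network link is an assumed baseline for comparison, not a figure from the EHT team, and real shared links from remote observatory sites would typically be slower.

```python
# Rough arithmetic behind the "sneakernet" decision. The ~5 PB dataset
# size and ~14 GB/s effective shipping rate come from the article; the
# 1 Gbps link speed is an assumption for illustration only.

PB = 10**15          # bytes in a petabyte (decimal)
HOUR = 3_600         # seconds per hour
dataset_bytes = 5 * PB

link_bytes_per_s = 1e9 / 8      # assumed dedicated 1 Gbps network link
shipping_bytes_per_s = 14e9     # effective rate of shipping the drives

net_hours = dataset_bytes / link_bytes_per_s / HOUR
ship_hours = dataset_bytes / shipping_bytes_per_s / HOUR

print(f"1 Gbps link:     {net_hours:>9,.0f} hours (~{net_hours / 8_766:.1f} years)")
print(f"Shipping drives: {ship_hours:>9,.0f} hours (~{ship_hours / 24:.1f} days)")
# 1 Gbps link:        11,111 hours (~1.3 years)
# Shipping drives:         99 hours (~4.1 days)
```

And that assumes a dedicated gigabit pipe running flat out; over the shared, intermittent links available at sites like a remote radio observatory, the gap only widens.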

Interestingly, this challenge facing the Event Horizon Telescope has much in common with the ones data centers are battling. Data creation on earth is projected to grow to 175 zettabytes by 2025,[5][6] most of it device-generated and much of it real-time.[7] The sheer volume threatens to overwhelm even next-generation networks. Like Bouman, we'll have to find alternatives.

Rather than resorting to air cargo, IT organizations will identify ways to avoid transmitting large quantities of raw data by moving compute and storage toward data sources. This is a primary driver of edge computing.

In the forthcoming, widely distributed ecosystem, compute and storage loci will proliferate. IoT devices will get smarter to provide initial data processing. On-premises data centers will gain new importance when located alongside heavy data-generating operations, and micro-data centers will be established where these legacy resources don’t exist. Vendors will offer 5G-based edge computing packages, with data center pods based at cell towers and telecom offices, and more local and regional cloud and colocation options will be developed.

The myriad data collected will increase in value as it is processed up the chain, culminating in the information our AI systems will plumb in centralized, often hyperscale, and potentially quantum-accelerated[8] data centers. From a knowledge perspective, the results may be as startling as a picture of a black hole.
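
To make the idea concrete, here is a minimal Python sketch of such a chain, assuming a simple three-tier pipeline (device, edge, core); the tier names, filtering threshold, and aggregation logic are illustrative inventions, not a reference architecture.

```python
# A minimal sketch of "processing data up the chain": each tier reduces
# raw readings before forwarding, so only distilled information reaches
# the central (hyperscale) facility. All names and thresholds here are
# assumptions for illustration.

from statistics import mean

def device_tier(raw_readings, threshold=0.5):
    """IoT device: drop obviously uninteresting samples at the source."""
    return [r for r in raw_readings if r >= threshold]

def edge_tier(filtered, window=10):
    """Micro or on-prem data center: aggregate samples into summary stats."""
    chunks = [filtered[i:i + window] for i in range(0, len(filtered), window)]
    return [{"mean": mean(c), "peak": max(c), "count": len(c)} for c in chunks if c]

def core_tier(summaries):
    """Central data center: only distilled features land here for AI/analytics."""
    return {
        "windows": len(summaries),
        "overall_peak": max(s["peak"] for s in summaries),
        "overall_mean": mean(s["mean"] for s in summaries),
    }

raw = [i % 7 / 10 for i in range(1_000)]          # stand-in sensor stream
summary = core_tier(edge_tier(device_tier(raw)))
print(f"raw samples: {len(raw)} -> summaries sent upstream: {summary['windows']}")
```

The point of the sketch is the shape of the flow, not the specifics: each hop discards or condenses data, so the volume crossing the wide-area network is a small fraction of what the devices originally produced.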

Armoring Up with Software

Many light-years closer than the Messier 87 galaxy lies the International Space Station, in low-earth orbit. There, Hewlett Packard Enterprise (HPE) is pursuing an experiment that could one day expand data processing capabilities at a different edge: in space itself.

The topic of investigation: the seemingly oxymoronic potential of software-based hardening. It's a proposed alternative to the time-consuming, expensive hardware-based processes by which space-bound IT gear is currently prepared to withstand ionizing radiation and solar flares.

Akin to building an extra-tough Toughbook, hardening today involves shielding sensitive components from the elements of space. Those limitations have so far prevented true supercomputing power from being sent into orbit or on missions traveling through our solar system and beyond.

HPE scientists hypothesized that throttling a supercomputer when hazardous conditions arise could extend the lifespan of off-the-shelf hardware sited in such a menacing environment.[9] To explore the possibility, they sent a Linux system to the ISS in 2017, hoping for a year of error-free operations. Over 530 days later and counting, the machine is still humming along nicely.[10]
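
Conceptually, the approach resembles a supervisory loop that watches health telemetry and backs off when conditions degrade. The sketch below illustrates only that general idea, with an invented error-rate signal, threshold, and throttle stub; it is not a description of HPE's actual Spaceborne Computer software.

```python
# A hedged sketch of software-based hardening: when the system detects
# hazardous conditions (here, a simulated error-rate signal), it
# throttles itself until conditions clear, rather than relying on
# physical shielding. Sensor, threshold, and throttle are assumptions.

import random
import time

ERROR_RATE_LIMIT = 0.02   # assumed threshold for "hazardous conditions"

def read_error_rate():
    """Stand-in for health telemetry (e.g., memory error counters)."""
    return random.random() * 0.05

def set_performance_mode(full_speed: bool):
    """Stand-in for throttling, e.g., lowering clocks or pausing jobs."""
    print("running at full speed" if full_speed else "throttled: holding workload")

def supervise(cycles: int = 5, interval_s: float = 0.1):
    for _ in range(cycles):
        rate = read_error_rate()
        set_performance_mode(full_speed=(rate < ERROR_RATE_LIMIT))
        time.sleep(interval_s)

if __name__ == "__main__":
    supervise()
```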

The ability to cost-efficiently pack more compute into spacecraft would be transformative. For exploration purposes, such as a future Mars mission, this could mean onboard “edge-like” processing of sensor data to streamline transmission to earth. Software-based systems also offer upgradeability. Deep space probes can take decades to reach their destinations. As hardening technologies develop, periodic code updates could be sent to help boost the resilience of onboard supercomputers.

What’s more, those hoping to build space-based data centers circling our own planet find themselves one step closer. This futuristic concept would leverage the advantages above, including plentiful solar energy, zero gravity, and a low-humidity, low-dust environment without hurricanes and other terrestrial weather extremes to worry about. Operating at −100° Celsius in the shade, but suffering high latency, space systems could provide the ultimate cold storage.[11][12]

Already the cost per gigabyte of launching storage capacity into space is plunging, due to both technology miniaturization and the cost-efficiencies of the commercial launch market. Readily available, AI-based remote monitoring systems are also improving in their ability to proactively detect, automatically diagnose, and even fix equipment faults at a distance. This capability has sent enterprise uptime through the roof and could help deliver the long-term performance necessary in an inaccessible data center in orbit.[13] The possibility of using commercial hardware in the extremes of space checks off another key ingredient for space-based computing.

As with other advances driven by space exploration, from high-end optics technologies to Tang, software-based hardening could eventually return home to fortify the technologies we earthlings rely on day to day. Whether or not this prediction proves true, it is clear that more space-borne concepts will take off here in the coming decades, so anyone interested in where the data center may be headed should do one thing: look up.