20,000 Gigs Under the Sea?

A novel data center design

By Business Energy Editor Laura Sanchez

The Microsoft Corporation tends to dive deep when it comes to innovative product development. The company’s most recent venture—Project Natick—carries that principle to the extreme in an experimental data center that stores computer servers below the ocean’s surface.

The prototype—which the team named Leona Philpot after a Halo video game character—is a 38,000-pound, 10-by-7-foot steel container housing a single data center rack bathed in pressurized nitrogen, which carries heat away from the computing chips.


The vessel was submerged 30 feet underwater off the coast of San Luis Obispo, California, for 105 days. From Microsoft’s Redmond, Washington, campus, the research team monitored it with cameras and with temperature, humidity, acoustic, and electrical-current sensors. A diver was also sent down once a month to check on the capsule. The project was considered such a success that a second submersible, with increased data storage capacity, is now in the works.

Some benefits of this submerged data center are more obvious than others. Cooling, for example, is facilitated by the abundance of ocean water. While traditional facilities rely on HVAC systems that consume large amounts of energy and water to keep servers at functional temperatures, Microsoft hopes to make the cooling process far more efficient by letting the surrounding seawater carry heat away from its steel containers.
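To make the cooling argument concrete, here is a rough back-of-envelope sketch of the seawater flow needed to absorb a rack’s heat load, using the basic heat-balance relation Q = m·c·ΔT. The 240 kW load and 5 °C allowable water temperature rise are illustrative assumptions, not figures from Project Natick.

```python
# Rough heat-balance sketch: how much seawater flow would it take
# to carry away a data center rack's heat load?
# Q = m_dot * c * delta_T  ->  m_dot = Q / (c * delta_T)
# The load and temperature rise below are assumed example values.

HEAT_LOAD_W = 240_000     # assumed IT heat load, watts (illustrative)
SPECIFIC_HEAT = 4_186     # approx. specific heat of seawater, J/(kg*K)
DELTA_T = 5.0             # assumed allowable water temperature rise, K

mass_flow = HEAT_LOAD_W / (SPECIFIC_HEAT * DELTA_T)   # kg/s

# Seawater is roughly 1 kg per liter, so kg/s ~ liters/s.
print(f"Required seawater flow: {mass_flow:.1f} kg/s "
      f"(~{mass_flow:.1f} liters per second)")
```

The point of the exercise: because water holds so much heat per degree, a modest flow of cold ocean water can do work that would otherwise demand energy-hungry chillers.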

The data center’s design may eventually enable it to harness the power of waves. Although the prototype pod was powered by an on-shore energy source, the team plans to employ hydrokinetic energy to make submarine data centers completely self-sustaining. The pods are designed to last about 20 years, with maintenance rotations every five years.
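For a sense of how much energy passing waves carry, here is an illustrative estimate using the standard deep-water wave energy flux formula, P = ρg²H²T / (64π), in watts per meter of wave crest. The 2-meter wave height and 8-second period below are assumed example sea conditions, not measurements from the project.

```python
# Illustrative wave-power estimate using the deep-water
# wave energy flux formula: P = rho * g^2 * H^2 * T / (64 * pi),
# expressed per meter of wave-crest length.

import math

RHO = 1_025   # seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def wave_power_kw_per_m(height_m: float, period_s: float) -> float:
    """Energy flux per meter of wave crest, in kW/m."""
    return RHO * G**2 * height_m**2 * period_s / (64 * math.pi) / 1000

# A modest 2 m swell with an 8 s period:
print(f"{wave_power_kw_per_m(2.0, 8.0):.1f} kW per meter of crest")
```

Even a modest swell carries on the order of 15 kW per meter of crest, which hints at why the team sees offshore pods as candidates for self-sufficiency, although converting that flux to usable electricity is far from lossless.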

Microsoft reportedly manages more than 100 data centers around the world and is adding more to its portfolio at a rapid rate. The New York Times reports that the company has spent more than $15 billion on the global data center system that currently supports more than 200 online services. It is seeking cost-effective, manufacturable alternatives to brick-and-mortar construction.

According to Microsoft spokesperson Athima Chansanchai, the company will be able to manufacture capsules and deploy them anywhere in the world within 90 days, sidestepping costly and time-consuming aspects of conventional construction projects such as permitting.

Data center location is key, and finding an appropriate, affordable site can be a complicated issue for developers. Not only do buildings take up a lot of space, but they also consume a lot of resources. Microsoft’s underwater data center sidesteps land constraints entirely and makes use of ample underwater real estate.

Furthermore, a data center’s proximity to its customers largely determines how quickly data can travel between them. And, as Microsoft project manager Ben Cutler points out, about half of the world’s population lives within 120 miles of the sea. Engineers expect that placing data centers offshore will reduce latency, making downloads, storage retrieval, and Web browsing noticeably faster.
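A quick back-of-envelope calculation shows what that 120-mile figure buys in propagation delay alone. The sketch below assumes the common rule of thumb that light travels through optical fiber at roughly two-thirds of its vacuum speed; real-world latency would be higher once routing and switching are added.

```python
# Propagation-only latency estimate: what does distance cost
# in signal travel time over optical fiber?
# Assumes fiber propagation at ~2/3 the vacuum speed of light.

SPEED_OF_LIGHT_KM_S = 299_792
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

for km in (193, 1000, 4000):   # 193 km is about 120 miles
    print(f"{km:>5} km -> {round_trip_ms(km):.2f} ms round trip")
```

At 120 miles, the fiber round trip is under 2 milliseconds, versus tens of milliseconds for a cross-continent hop—a rough illustration of why parking servers just offshore from coastal populations is attractive.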

We talk a lot about data centers these days—about keeping them cool and powering them, as well as maximizing their efficiency and security. It seems to me that Microsoft has addressed all of these issues except for security.

What do you think about this innovation? Do you think that the ocean is a safe place to keep critical information? By removing the human element, is the company reducing or increasing its vulnerability to physical threats or cyber attacks?
