Undersea Data Center

Updated 08/09/2019 – Microsoft has installed two underwater cameras that offer live video feeds of the submerged data center. You can now watch all kinds of sea creatures swimming around a tank that holds 27.6 petabytes of data.

Followers of the Bach Seat know that Microsoft (MSFT) has been experimenting with undersea data centers to cut the costs of deploying data centers. Back in 2015, I wrote about MSFT’s initial experiment off the California coast, where the company first tried out the idea of an underwater data center. Redmond has now announced phase 2 of Project Natick, which is designed to test the practical aspects of deploying a full-scale, lights-out underwater data center called “Northern Isles.”

Kurt Mackie wrote in an article at Redmond Magazine that Microsoft is testing this underwater data center off the coast of Scotland near the Orkney Islands in the North Sea. Microsoft wants to place data centers offshore because about half the world’s population lives within 125 miles of a coast. Locating data closer to its users reduces latency for bandwidth-intensive applications such as video streaming and gaming, as well as emerging artificial intelligence-powered apps. Latency is the time it takes data to travel from its source to customers. It is like the difference between running an application from your local hard drive and running it over the network.
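To put that in perspective, here is a rough back-of-the-envelope sketch (mine, not Microsoft’s) of one-way propagation delay over optical fiber, where light travels at roughly two-thirds of its vacuum speed:

```python
# Rough, illustrative estimate of one-way network latency from distance.
# Light moves through optical fiber at roughly two-thirds of its vacuum
# speed; real-world latency adds routing, queuing, and processing overhead.

SPEED_OF_LIGHT_KM_S = 299_792  # km/s in a vacuum
FIBER_FACTOR = 2 / 3           # typical slowdown inside glass fiber

def fiber_latency_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds over fiber."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000

# A coastal user about 125 miles (roughly 200 km) from an offshore data center:
print(f"{fiber_latency_ms(200):.1f} ms")   # ~1.0 ms one way
# The same user reaching a data center 2,000 km inland:
print(f"{fiber_latency_ms(2000):.1f} ms")  # ~10.0 ms one way
```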

Mr. Mackie notes that the original underwater data center had the computing power of 300 PCs; Phase 2’s computing power is equal to “several thousand high-end consumer PCs,” according to Microsoft’s FAQ page. This next-generation underwater data center requires 240 kW of power, is 40 feet in length, and holds 12 racks with 864 servers. The submarine container is mounted on a metal platform on the seafloor, 117 feet deep. The Phase 2 data center can house 27.6 petabytes of data, and a fiber-optic cable keeps it connected to the outside world. Naval Group, a 400-year-old French company, built the submarine part of the project.
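A quick pass over those published specs (my arithmetic, not Microsoft’s) shows how dense the deployment is:

```python
# Back-of-the-envelope math on the phase 2 specs quoted above:
# 240 kW of power, 12 racks, 864 servers.
total_kw, racks, servers = 240, 12, 864

print(servers // racks)                  # 72 servers per rack
print(total_kw / racks)                  # 20.0 kW per rack
print(round(total_kw * 1000 / servers))  # ~278 W per server
```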

The interesting part (U.S. Navy submarines have had computers onboard for years) is the lights-out part. Running lights-out lets Microsoft rethink how data centers are deployed, starting with cooling. Northern Isles’ cold-aisle temperature is kept at a chilly 54°F (12°C) to remove the stress that temperature variations place on components. This temperature is maintained by a heat-exchange process developed for cooling submarines. Ben Cutler, the Microsoft Research Project Natick lead, told Data Center Knowledge, “... by deploying in the water we benefit from ready access to cooling – reducing the requirement for energy for cooling by up to 95%.”

With Phase 2, Mr. Cutler explained to DCK, there is no external heat exchanger: “We’re pulling raw seawater in through the heat exchangers in the back of the rack and back out again.” This cooling system can cope with very high power densities, such as those required by GPU-packed servers used for heavy-duty high-performance computing and AI workloads.

According to DCK, the first iteration of Project Natick had a Power Usage Effectiveness (PUE) rating of 1.07, compared to 1.125 for Microsoft’s latest-generation data centers. PUE is the ratio of total facility power to the power delivered to the IT equipment; the lower the metric, the more efficiently the data center uses electricity. Microsoft hopes to improve the PUE for the phase 2 data center.
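For the curious, here is a minimal sketch of the PUE calculation, using hypothetical load figures chosen only to reproduce the numbers above:

```python
# Illustrative PUE calculation. PUE is total facility power divided by
# IT equipment power; 1.0 would mean zero overhead for cooling, power
# conversion, lighting, and so on. The load figures are hypothetical.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

it_load = 240.0                       # hypothetical IT load in kW
print(pue(it_load * 1.07, it_load))   # 1.07, Project Natick phase 1
print(pue(it_load * 1.125, it_load))  # 1.125, Microsoft's land data centers
```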

Data centers are believed to consume up to 3% of the world’s electricity. The new cooling approach changes the Northern Isles data center’s power requirements. It can run off the Orkney Islands’ local electrical grid, which is powered by renewable wind, solar, and tidal sources. One of the goals of the project is to test powering the data center with an off-the-grid source, such as nearby tidal power.

Future versions of the underwater data center could also have their own power generation. Mr. Cutler told DCK, “Tide is a reliable, predictable sort of a thing; we know when it’s going to happen … Imagine we have tidal energy, we have battery storage, so you can get a smooth roll across the full 24-hour cycle and the whole lunar cycle.”

This would allow Microsoft to do away with backup generators and rooms full of batteries. They could over-provision the tidal generation capacity to ensure reliability (13 tidal turbines instead of 10, for example). Mr. Cutler says, “You end up with a simpler system that’s purely renewable and has the smallest footprint possible.”
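To see why a few extra turbines go a long way, here is an illustrative binomial sketch. It assumes each turbine is independently available with some probability, a simplification of my own rather than Microsoft’s actual reliability model:

```python
# Illustrative sketch of over-provisioning: if the site needs 10 turbines'
# worth of output, installing 13 makes "at least 10 running" far more
# likely. Assumes independent failures with a made-up availability p.
from math import comb

def prob_at_least(k: int, n: int, p: float) -> float:
    """Probability that at least k of n independent turbines are up."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

p = 0.95  # hypothetical per-turbine availability
print(f"10 of 10 up: {prob_at_least(10, 10, p):.3f}")  # about 0.599
print(f"10 of 13 up: {prob_at_least(10, 13, p):.3f}")  # about 0.997
```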

The Northern Isles underwater data center is designed to run without being staffed, which cuts down on human error. It uses a “fail-in-place” approach: failed components are not serviced, they are just left in place. Operations are monitored by artificial intelligence. Mr. Cutler said, “There’s a lot of data showing that when people fix things they’re also likely to cause some other problem.”

Operating in “lights-out” mode with no human presence allows most of the oxygen and water vapor to be removed from Northern Isles’ atmosphere. MSFT replaced the oxygen with 100% dry nitrogen. This environment should greatly cut the amount of corrosion in the equipment, a major problem in data centers on land. Mr. Cutler told DCK, “With the nitrogen atmosphere, the lack of oxygen, and the removal of some of the moisture is to get us to a better place with corrosion, so the problems with connectors and the like we think should be less.”

The Redmond Magazine article says Project Natick’s phase 2 has already proved that it’s possible to deploy an underwater data center in less than 90 days “from the factory to operation.” The logistics of building underwater data centers are very different from building data centers on land. Northern Isles was produced through a standardized supply chain, not a construction process. Mr. Cutler said, “Instead of a construction project, it’s a manufactured item; it’s manufactured in a factory just like the computers we put inside it, and now we use the standard logistical supply chain to ship those anywhere.”

The data center itself is also more standardized. It was purposely built to the size of a standard ISO shipping container, so it can travel by truck, train, or ship; Naval Group shipped Northern Isles to Scotland on a flatbed truck. Mr. Cutler told DCK, “We think the structure is potentially simpler and more uniform than we have for data centers today … the expectation is there actually may be a cost advantage to this.”

The rapid deployment of these data centers doesn’t just mean expanding faster; it also means tying up less capital. Mr. Cutler explained, “It takes us in some cases 18 months or two years to build new data centers … Imagine if instead … where I can rapidly get them anywhere in 90 days. Well, now my cost of capital is very different … As long as we’re in this mode where we have exponential growth of web services and consequently data centers, that’s enormous leverage.”

rb-

If Project Natick stays on this trajectory, MSFT could bring data centers to any place in the developed or developing world without adding more stress on local infrastructure. MSFT’s Cutler told DCK, “There’s no pressure on the electric grid, no pressure on the water supply, but we bring the cloud.”

As more of the world’s population comes online, the need for data centers is going to skyrocket, and having a fast, green solution like this would prove remarkably useful.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.
