Tag Archive for Renewable energy

Under Water Data Center Resurfaces

Under Water Data Center Resurfaces – Updated 07/07/2024 – Microsoft has discontinued its efforts to build data centers on the sea floor. “I’m not building subsea data centers anywhere in the world,” Noelle Walsh, the head of Microsoft’s Cloud Operations and Innovation division, told DatacenterDynamics.

Two years ago, Microsoft sank a data center half a mile off Scotland’s Orkney Islands under 117 feet of North Sea water. Earlier this week, they dredged the shipping-container-size data center of 864 servers and 27.6 petabytes of storage back to the surface. Now that it has resurfaced, Microsoft (MSFT) researchers are studying how it survived its trip into Davy Jones’ locker and what the trip can tell us about land-loving data centers.

Lower failure rate

Their first conclusion is that the cylinder, with servers packed in like sardines, had a lower failure rate than a conventional data center. Only eight of the 855 servers on board had failed. Ben Cutler, a project manager in Microsoft’s Special Projects research group who leads Project Natick, said in a press statement,

Our failure rate in the water is one-eighth of what we see on land.
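A quick back-of-the-envelope check of those figures, taking the 8-of-855 count and the one-eighth claim at face value:

```python
# Underwater failure rate from the figures quoted above: 8 failed of 855 servers.
underwater_rate = 8 / 855                # roughly 0.9%
# If that is one-eighth of the land-based rate, the implied land rate is:
implied_land_rate = underwater_rate * 8  # roughly 7.5% over the same period
print(f"underwater: {underwater_rate:.2%}, implied land: {implied_land_rate:.2%}")
```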

The MSFT team speculates that the greater reliability may be connected to the fact that there were no humans on board. Microsoft’s John Roach explained:

The team hypothesizes that the atmosphere of nitrogen, which is less corrosive than oxygen, and the absence of people to bump and jostle components, are the primary reasons for the difference. If the analysis proves this correct, the team may be able to translate the findings to land data centers.

They believe that land-loving data centers often run into issues like corrosion from oxygen, humidity and temperature fluctuations, and bumps and jostles from people who replace broken components.


Alternate power sources for data centers

Project Natick is also about addressing the huge energy demands of data centers as more and more of our data is stored in the cloud. All of Orkney’s electricity comes from alternative power sources, wind and solar, which was not a problem for the underwater data center “Northern Isles.” Spencer Fowers, principal member of technical staff in Microsoft’s Special Projects research group, said,

We have been able to run really well on what most land-based data centers consider an unreliable grid.

Not only can data centers run on alternative power, but they may not need the huge investment in dedicated buildings, rooms of batteries, and racks of UPSes. Microsoft’s Fowers speculates:

We are hopeful that we can look at our findings and say maybe we don’t need to have quite as much infrastructure focused on power and reliability.

Underwater data center availability

Microsoft has clammed up about the availability of an underwater data center SKU, but MSFT’s Cutler is confident that it has proved the idea has value:

We think that we’re past the point where this is a science experiment … Now it’s simply a question of what do we want to engineer – would it be a little one, or would it be a large one?

rb-

The drive to autonomous vehicles is just one case that explains MSFT’s idea of micro self-contained data centers vs. mega data centers. Even with 5G, computing power will have to move closer to the user, to the edge of the network. How much latency do you want as your autonomous Tesla, traveling at 70 MPH, tries to figure out where it is?
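To put a rough number on that, here is a sketch of how far a car moving at 70 MPH travels while waiting out a round trip to a data center; the latency values are hypothetical, chosen only to illustrate the edge-computing argument:

```python
# 70 MPH converted to metres per second.
speed_m_per_s = 70 * 1609.344 / 3600          # about 31.3 m/s
# Hypothetical round-trip latencies: distant cloud, regional, and edge.
for latency_s in (0.100, 0.020, 0.005):
    distance_m = speed_m_per_s * latency_s
    print(f"{latency_s * 1000:5.0f} ms  ->  {distance_m:.2f} m travelled blind")
```

At a 100 ms round trip the car covers roughly three metres before an answer comes back, which is the heart of the edge argument.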

Stay safe out there!


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Undersea Data Center

Updated 08/09/2019 – Microsoft has installed two underwater cameras that offer live video feeds of the sunken data center. You can now watch all kinds of sea creatures swimming around a tank that holds 27.6 petabytes of data.

Followers of the Bach Seat know that Microsoft (MSFT) has experimented with undersea data centers to save costs associated with deploying data centers. Back in 2015, I wrote about MSFT’s initial experiment off the California coast, where MSFT first tried out the idea of an underwater data center. Redmond has announced phase 2 of Project Natick, designed to test the practical aspects of deploying a full-scale lights-out data center underwater, called “Northern Isles.”

Kurt Mackie wrote in an article at Redmond Magazine that Microsoft is testing this underwater data center off the coast of Scotland near the Orkney Islands in the North Sea. Microsoft wants to place data centers offshore because about half the world’s population lives within 125 miles of a coast. Locating data closer to its users reduces latency for bandwidth-intensive applications such as video streaming and gaming, as well as emerging artificial intelligence-powered apps. Latency is the time it takes data to travel from its source to customers. It is like the difference between using an application on your hard drive vs. running it over the network.
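For a sense of scale, a minimal estimate of the propagation delay alone over those 125 miles, assuming light travels through optical fibre at roughly two-thirds of its vacuum speed (real-world latency is higher once routing and switching are added):

```python
# One-way propagation delay over ~125 miles of fibre.
distance_km = 125 * 1.609344      # about 201 km
fibre_speed_km_s = 200_000        # roughly 2/3 the speed of light in vacuum
one_way_ms = distance_km / fibre_speed_km_s * 1000
print(f"about {one_way_ms:.1f} ms one way, {2 * one_way_ms:.1f} ms round trip")
```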

Mr. Mackie posts that the original underwater data center had the computing power of 300 PCs; Phase 2’s computing power is equal to “several thousand high-end consumer PCs,” according to Microsoft’s FAQ page. This next-generation underwater data center requires 240 kW of power, is 40 feet in length, and holds 12 racks with 864 servers. The submarine container is mounted on a metal platform on the seafloor, 117 feet deep. The Phase 2 data center can house 27.6 petabytes of data. A fiber-optic cable keeps it connected to the outside world. Naval Group, a 400-year-old French company, built the submarine part of the project.
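Those figures imply a fairly ordinary power density, which falls out of simple division:

```python
total_kw, racks, servers = 240, 12, 864    # figures from the article
per_rack_kw = total_kw / racks             # 20 kW per rack
per_server_w = total_kw * 1000 / servers   # about 278 W per server
print(f"{per_rack_kw} kW per rack, {per_server_w:.0f} W per server")
```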

The interesting part (U.S. Navy submarines have had computers onboard for years) is the lights-out part. Lights-out operation allows Microsoft to change how data centers are deployed. Northern Isles also changes cooling techniques. The cold-aisle temperature is kept at a chilly 54°F (12°C) to remove the stress that temperature variations place on components. This temperature is maintained by using a heat-exchange process developed for cooling submarines. Ben Cutler, Microsoft Research Project Natick lead, told Data Center Knowledge, “... by deploying in the water we benefit from ready access to cooling – reducing the requirement for energy for cooling by up to 95%.”

With Phase 2, Mr. Cutler explained to DCK, there’s no external heat exchanger: “We’re pulling raw seawater in through the heat exchangers in the back of the rack and back out again.” This cooling system could cope with very high power densities, such as the ones required by GPU-packed servers used for heavy-duty high-performance computing and AI workloads.

According to DCK, the first iteration of Project Natick had a Power Usage Effectiveness (PUE) rating of 1.07 (compared to 1.125 for Microsoft’s latest-generation data centers). The lower the PUE metric, the more efficiently the data center uses electricity. Microsoft hopes to improve the PUE for the phase 2 data center.
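PUE is simply total facility power divided by the power reaching the IT equipment, so the overhead spent on cooling and power distribution falls straight out of the ratio:

```python
def overhead_share(pue: float) -> float:
    """Non-IT overhead as a fraction of IT power, given a PUE rating."""
    return pue - 1.0

# Natick phase 1 vs. the land-based figure cited above:
for rating in (1.07, 1.125):
    print(f"PUE {rating}: {overhead_share(rating):.1%} overhead on top of IT power")
```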

Data centers are believed to consume up to 3% of the world’s electricity. The new cooling options change the Northern Isles data center’s power requirements. It can run off the Orkney Islands’ local electrical grid, which is powered by renewable wind, solar, and tidal sources. One of the goals of the project is to test powering the data center with an off-the-grid source, such as nearby tidal power.

Future versions of the underwater data center could also have their own power generation. Mr. Cutler told DCK, “Tide is a reliable, predictable sort of a thing; we know when it’s going to happen … Imagine we have tidal energy, we have battery storage, so you can get a smooth roll across the full 24-hour cycle and the whole lunar cycle.”

This would allow Microsoft to do away with backup generators and rooms full of batteries. They could over-provision the tidal generation capacity to ensure reliability (13 tidal turbines instead of 10, for example). Mr. Cutler says, “You end up with a simpler system that’s purely renewable and has the smallest footprint possible.”
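The value of those three spare turbines can be sketched with a simple binomial model; the 90% per-turbine availability below is an assumed figure for illustration, not anything Microsoft has published:

```python
from math import comb

def availability(n: int, need: int, p: float) -> float:
    """Probability that at least `need` of `n` independent turbines,
    each up with probability p, are running."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

p = 0.90  # assumed per-turbine availability
print(f"10 turbines, need 10: {availability(10, 10, p):.1%}")  # no spares
print(f"13 turbines, need 10: {availability(13, 10, p):.1%}")  # three spares
```

Under these assumptions, the three spare turbines lift the odds of having full capacity from roughly one in three to well over 95%.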

The Northern Isles underwater data center is designed to run without being staffed, which cuts down on human error. It is designed with a “fail-in-place” approach: failed components are not serviced, they are just left in place. Operations are monitored by artificial intelligence. Mr. Cutler said, “There’s a lot of data showing that when people fix things they’re also likely to cause some other problem.”

Operating in “lights-out” mode with no human presence allows most of the oxygen and water vapor to be removed from Northern Isles’ atmosphere. MSFT replaced the oxygen with 100% dry nitrogen. This environment should greatly cut the amount of corrosion in the equipment, a major problem in data centers on land. Mr. Cutler told DCK, “With the nitrogen atmosphere, the lack of oxygen, and the removal of some of the moisture is to get us to a better place with corrosion, so the problems with connectors and the like we think should be less.”

The Redmond Magazine article says Project Natick’s phase 2 has already proved that it’s possible to deploy an underwater data center in less than 90 days “from the factory to operation.” The logistics of building underwater data centers are very different from building data centers on land. Northern Isles was manufactured via a standardized supply chain, not as a construction process. Mr. Cutler said, “Instead of a construction project, it’s a manufactured item; it’s manufactured in a factory just like the computers we put inside it, and now we use the standard logistical supply chain to ship those anywhere.”

The data center is also more standardized. It was purposely built to the size of a standard ISO shipping container, so it can be shipped by truck, train, or ship. Naval Group shipped Northern Isles to Scotland on a flatbed truck. Mr. Cutler told DCK, “We think the structure is potentially simpler and more uniform than we have for data centers today … the expectation is there actually may be a cost advantage to this.”

The rapid time to deploy these data centers doesn’t only mean expanding faster, it also means spending less capital. Mr. Cutler explained, “It takes us in some cases 18 months or two years to build new data centers … Imagine if instead … I can rapidly get them anywhere in 90 days. Well, now my cost of capital is very different … As long as we’re in this mode where we have exponential growth of web services and consequently data centers, that’s enormous leverage.”

rb-

If Project Natick stays on the same trajectory, MSFT could bring data centers to any place in the developed or developing world without adding more stress on local infrastructure. MSFT’s Cutler told DCK, “There’s no pressure on the electric grid, no pressure on the water supply, but we bring the cloud.”

As more of the world’s population comes online, the need for data centers is going to skyrocket, and having a fast, green solution like this would prove remarkably useful.



Is Your Data Center Underwater?

Every time you like something on Facebook, it causes a computer in a cloud data center somewhere in the world to do something. That computer uses electricity to let the world know you like the sleepy puppy video or what your dinner looked like.

As you may have noticed if you have left your laptop on your lap for too long, computers also produce heat. Facebook (FB), Twitter (TWTR), Instagram, and all the other time-wasters have millions of computers generating excess heat that needs to go somewhere. It is estimated that Facebook alone has hundreds of thousands of servers.

Keep servers cool

One of the ways to keep servers cool is to keep them wet. As counter-intuitive as that seems, there are companies that use liquid immersion to cool their servers, according to the Register. This approach uses data centers featuring large “baths” filled with a dielectric liquid into which racks of equipment are submerged.

Mineral oil has been used in immersion cooling before. Perhaps the best-known proponent of liquid immersion cooling is Green Revolution Cooling. Its CarnotJet system allows rack-mounted servers from any OEM to be dunked in special racked baths filled with a dielectric mineral oil blend called ElectroSafe (PDF), an electrical insulator that the company claims has 1,200 times more heat capacity by volume than air.
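That 1,200× figure is plausible from textbook values alone; the densities and specific heats below are generic numbers for air and a mineral oil, not ElectroSafe’s actual specifications:

```python
# Volumetric heat capacity = density [kg/m^3] * specific heat [J/(kg*K)].
air_vol_cp = 1.2 * 1005    # air: about 1.2e3 J/(m^3*K)
oil_vol_cp = 850 * 1900    # generic mineral oil: about 1.6e6 J/(m^3*K)
ratio = oil_vol_cp / air_vol_cp
print(f"oil stores roughly {ratio:.0f}x more heat per unit volume than air")
```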

Green Revolution Cooling claims cooling energy reductions of up to 95 percent, server power savings of 10-25%, data center build-out cost reductions of up to 60% through simplified architecture, and improved server performance and reliability as a result of less exposure to dust (and moisture).
Microsoft has taken this technology to the next level and is now experimenting with locating entire data centers underwater.

Microsoft underwater data center

ComputerWorld is reporting that Microsoft designed, built, and deployed its own sub-sea data center in the ocean in about a year. The Redmond, WA firm started working on the project in late 2014. Microsoft employee Sean James, who served on a U.S. Navy submarine, submitted a paper on the concept.

The eight-foot diameter steel prototype vessel, named after the Halo character Leona Philpot, operated 30 feet underwater on the Pacific Ocean seafloor, about 1 kilometer off the California coast near San Luis Obispo, for 105 days from August to November 2015, according to Microsoft. Microsoft engineers remotely controlled the data center and even ran commercial data-processing projects from Microsoft’s Azure cloud computing service in the submerged data center.

The sub-sea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage, and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers. Microsoft says,

Project Natick reflects Microsoft’s ongoing quest for cloud data center solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable.

Microsoft believes that undersea data centers can serve the 50% of people who live within 200 kilometers of the ocean. They say that deployment in deep water offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.

Microsoft is weighing coupling the data center with a turbine or a tidal energy system to generate electricity, according to the New York Times.

Environmental impact

A new trial is expected to begin next year, possibly near Florida or in Northern Europe, Microsoft engineers told the NYT.

Some users questioned whether an undersea data center could have an environmental impact, such as heating up the water around it. But Microsoft claimed on its website that the project envisages data centers that would be totally recycled and would have zero emissions when located alongside offshore renewable energy sources. MSFT told Computerworld:

No waste products, whether due to the power generation, computers, or human maintainers are emitted into the environment … During our deployment of the Leona Philpot vessel, sea life in the local vicinity quickly adapted to the presence of the vessel.

rb-

I have covered some other alternative ways to deal with data centers on Bach Seat, including HP’s plans to use cow manure to generate electricity and Microsoft’s plan to use sewer gas to power a data center in Wyoming.

Underwater data centers are an attractive idea, but there are challenges. One concern is that saltwater could corrode the structures. This issue could be addressed by locating the data centers in the freshwater Great Lakes. The Great Lakes basin is projected to reach a population of about 65 million by 2025.


