Tag Archive for Green

Will Climate Change Sink the Web?

Despite claims to the contrary, climate change is real, and it will break critical parts of the Internet within 20 years. That is the prediction of a study by Paul Barford, a professor of computer science at the University of Wisconsin-Madison.

Professor Barford presented his findings at IETF 102, a Montreal meeting of the Internet Engineering Task Force, the Association for Computing Machinery, the Internet Society, and the Institute of Electrical and Electronics Engineers. The study, “Lights Out: Climate Change Risk to Internet Infrastructure,” found that critical communications infrastructure could be submerged by rising seas in as soon as 15 years.

Conventional copper and fiber optic cables

Companies like Google, Microsoft, Facebook, and Cable and Wireless go to enormous cost and effort to protect the undersea cables spanning the continents, but once a cable hits the shore, the signal moves onto conventional cables. The conventional copper and fiber optic cables buried decades ago, which carry the signals from the landing points to the interior, were not designed to withstand inundation by the saltwater that climate change will bring.

Internet landing points that will be impacted by climate change

Popular Science reports that Professor Barford’s research found that climate change will impact more than 4,000 miles of buried fiber optic conduit. These conduits and internet cables will most likely end up underwater, and exposure to saltwater degrades their ability to carry signals, rendering them inoperable. The cable landing stations where undersea cables connect the U.S. Internet to the rest of the world will also be vulnerable. The study also predicts that water will surround over 1,100 traffic hubs.

Undersea fiber optic cable landing point susceptible to flooding

Major interruptions

Mr. Barford told Popular Science that this service interruption is likely to become a growing problem within the next 15 years. He warned that communications companies should begin implementing protective measures and solutions soon if they want to avoid major interruptions in the near future.

“Most of the damage that’s going to be done in the next 100 years will be done sooner than later,” says Dr. Barford, the keeper of the Internet Atlas, a comprehensive repository of the physical Internet — the buried fiber optic cables, data centers, traffic exchanges and termination points that are the nerve centers, arteries, and hubs of the vast global information network. “That surprised us. The expectation was that we’d have 50 years to plan for it. We don’t have 50 years.” He also notes “The landing points are all going to be underwater in a short period of time.”

The study is the first risk assessment of the impact of climate change on the U.S. infrastructure of the Internet. It reports that Miami, New York, and Seattle are among the areas where connectivity could be most affected. The Internet in these cities is at risk because cables carrying it tend to converge on a few fiber optic strands that lead to large population centers.

Fiber optic cable conduit susceptible to flooding

But the effects of climate change would not be confined to those areas and would ripple across the Internet, potentially disrupting global communications. Many of the conduits at risk are already close to sea level, and only a slight rise in ocean levels due to melting polar ice and thermal expansion will expose the buried fiber optic cables to seawater.

No thought was given to climate change

Much of the infrastructure at risk is buried and follows long-established rights of way, typically paralleling highways and coastlines. The roots of the danger emerged inadvertently during the Internet’s rapid growth in the 1980s before there was widespread awareness of the Internet as a global grid or the massive threats of climate change. Professor Barford says, “When it was built 20-25 years ago, no thought was given to climate change.”

To reach this conclusion, the team combined data from the Internet Atlas and projections of sea level incursion from the National Oceanic and Atmospheric Administration (NOAA).
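The study’s own tooling isn’t public, but the basic overlay is easy to picture. The sketch below is a minimal, hypothetical illustration in Python using geopandas; the file names, the one-foot sea-level-rise layer, and the CONUS Albers projection are my assumptions, not details from the paper.

```python
# Hypothetical sketch: intersect buried-conduit routes with a projected
# sea-level-rise inundation layer to estimate how much conduit gets wet.
import geopandas as gpd

# Assumed inputs: Internet Atlas conduit lines and a NOAA inundation polygon layer.
conduits = gpd.read_file("internet_atlas_conduits.shp").to_crs(epsg=5070)  # CONUS Albers, meters
inundation = gpd.read_file("noaa_slr_1ft.shp").to_crs(epsg=5070)

flood_zone = inundation.unary_union  # merge inundation polygons into one geometry
conduits["flooded_m"] = conduits.geometry.intersection(flood_zone).length

miles = conduits["flooded_m"].sum() / 1609.34
print(f"~{miles:,.0f} miles of conduit fall inside the projected flood zone")
```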

Fiber optic cable

Science Daily says the findings of the study serve notice to industry and government: “This is a wake-up call. We need to be thinking about how to address this issue.” Mikhail Chester, the director of the Resilient Infrastructure Laboratory at Arizona State University, told National Geographic that the new study “reinforces this idea that we need to be really cognizant of all these systems because they’re going to take a long time to upgrade.”

ISP responses to climate change

The impact of mitigation such as sea walls, according to the study, is difficult to predict. “The first instinct will be to harden the infrastructure,” Professor Barford says. “But keeping the sea at bay is hard. We can probably buy a little time, but in the long run, it’s just not going to be effective.”

US shore susceptible to flooding

The study called out individual internet service providers, finding that AT&T (T), Verizon (VZ), and CenturyLink (CTL) are most at risk. In response, AT&T spokesman Jeff Kobs told NPR,

AT&T uses fiber optic cable “designed for use in coastal areas as well as being submerged in either salt- or fresh-water conditions.” … “In certain locations where cabling will be submerged for long periods of time or consistently exposed, such as beaches or in subways, we use submarine underwater cabling.”

Verizon spokeswoman Karen Schulz told NPR,

After Sandy, we started upgrading our network in earnest, and replacing our copper assets with fiber assets … Copper is impacted by water, whereas fiber is not. We’ve switched significant amounts of our network from copper to fiber in the Northeast.

She explained that Verizon’s focus on flood risk

really has less to do with sea-level change and more to do with general flooding concerns … For cable landing stations that are very close to the oceans and that have undersea cables, we specifically assess sea-level changes.

A CenturyLink representative told Popular Mechanics the company can handle the problem, saying that CenturyLink networks are designed with redundancy and can divert traffic to alternate routes when infrastructure goes down.

rb-

Donald Trump Still Doesn’t Believe in Climate Change

The Verizon and CenturyLink responses seem to totally miss the point.

The impact of large-scale Internet failures goes beyond Facebook and iTunes. The failure of the Internet would disrupt many real people’s day-to-day services: online banking, traffic signals, railroad routing, the sharing of medical records among doctors and hospitals, and the growing “internet of things” that ranges from household appliances to regional grids of electric power production and transmission.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Undersea Data Center

Updated 08/09/2019 – Microsoft has installed two underwater cameras that offer live video feeds of the sunken data center. You can now watch all kinds of sea creatures swimming around a tank that holds 27.6 petabytes of data.

Undersea Data Center

Followers of the Bach Seat know that Microsoft (MSFT) has experimented with undersea data centers to cut the cost of deploying data centers. Back in 2015, I wrote about MSFT’s initial experiment off the California coast, where it first tried out the idea of an underwater data center. Redmond has now announced Phase 2 of Project Natick, which is designed to test the practical aspects of deploying a full-scale, lights-out data center underwater, called “Northern Isles.”

Undersea Data Center

Kurt Mackie wrote in an article at Redmond Magazine that Microsoft is testing this underwater data center off the coast of Scotland, near the Orkney Islands in the North Sea. Microsoft wants to place data centers offshore because about half the world’s population lives within 125 miles of a coast. Locating data closer to its users reduces latency for bandwidth-intensive applications such as video streaming and gaming, as well as emerging artificial intelligence-powered apps. Latency is the time it takes data to travel from its source to customers; it is like the difference between running an application from your hard drive and running it over the network.
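To make the latency point concrete, here is a rough propagation-delay calculation of my own; the two-thirds-of-light-speed figure for fiber is a standard rule of thumb, and the distances are illustrative rather than anything Microsoft published.

```python
# Back-of-the-envelope fiber propagation delay; ignores routing and queuing delays.
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67       # light in glass travels at roughly 2/3 of c

def fiber_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay, in milliseconds, over a fiber path."""
    one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

print(f"{fiber_rtt_ms(200):.1f} ms RTT at ~200 km (about 125 miles)")   # ≈ 2 ms
print(f"{fiber_rtt_ms(2500):.1f} ms RTT at ~2,500 km (a distant site)")  # ≈ 25 ms
```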

Mr. Mackie writes that the original underwater data center had the computing power of 300 PCs, while Phase 2’s computing power is equal to “several thousand high-end consumer PCs,” according to Microsoft’s FAQ page. This next-generation underwater data center requires 240 kW of power, is 40 feet in length, and holds 12 racks with 864 servers. The submarine container is mounted on a metal platform on the seafloor, 117 feet deep. The Phase 2 data center can house 27.6 petabytes of data, and a fiber-optic cable keeps it connected to the outside world. Naval Group, a 400-year-old French company, built the submarine part of the project.

The interesting part (U.S. Navy submarines have had computers onboard for years) is the lights-out part. Lights-out operation lets Microsoft change how data centers are deployed. Northern Isles also changes cooling: the cold-aisle temperature is kept at a chilly 54°F (12°C) to remove the stress that temperature variations place on components. This temperature is maintained using a heat-exchange process developed for cooling submarines. Ben Cutler, Microsoft Research’s Project Natick lead, told Data Center Knowledge, “... by deploying in the water we benefit from ready access to cooling – reducing the requirement for energy for cooling by up to 95%.”

Heat exchanger

With Phase 2, Mr. Cutler explained to DCK, there’s no external heat exchanger: “We’re pulling raw seawater in through the heat exchangers in the back of the rack and back out again.” This cooling system could cope with very high power densities, such as the ones required by GPU-packed servers used for heavy-duty high-performance computing and AI workloads.

According to DCK the first iteration of Project Natick had a Power Usage Effectiveness (PUE) rating of 1.07 (compared to 1.125 for Microsoft’s latest-generation data centers). The lower the PUE metric, the more efficiently the data center uses electricity. Microsoft hopes to improve the PUE for the phase 2 data center.
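For context, PUE is just the ratio of total facility power to the power that actually reaches the IT equipment, so a lower number means less overhead for cooling and power distribution. The quick calculation below is illustrative only; treating the article’s 240 kW figure as the IT load is my assumption.

```python
# PUE = total facility power / IT equipment power; 1.0 would mean zero overhead.
def total_draw_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a PUE rating."""
    return it_load_kw * pue

it_load_kw = 240  # assumption: treat Northern Isles' 240 kW as pure IT load
for rating in (1.07, 1.125):
    print(f"PUE {rating}: total draw ≈ {total_draw_kw(it_load_kw, rating):.0f} kW")
# At 1.07 the overhead is ~17 kW; at 1.125 it would be ~30 kW.
```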

Off-the-grid tidal power

Data centers are believed to consume up to 3% of the world’s electricity. The new cooling options change the Northern Isles data center’s power requirements. It can run off the Orkney Islands’ local electrical grid, which is powered by renewable wind, solar, and tidal sources. One of the goals of the project is to test powering the data center with an off-the-grid source, such as nearby tidal power.

Future versions of the underwater data center could also have their own power generation. Mr. Cutler told DCK, “Tide is a reliable, predictable sort of a thing; we know when it’s going to happen … Imagine we have tidal energy, we have battery storage, so you can get a smooth roll across the full 24-hour cycle and the whole lunar cycle.”

This would allow Microsoft to do away with backup generators and rooms full of batteries. They could over-provision the tidal generation capacity to ensure reliability (13 tidal turbines instead of 10, for example). Mr. Cutler says, “You end up with a simpler system that’s purely renewable and has the smallest footprint possible.”
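The over-provisioning logic is easy to sanity-check with a little probability. The sketch below is my own illustration, not Microsoft’s analysis: it assumes turbine failures are independent and picks a 95% per-turbine availability out of thin air, then asks how likely it is that at least 10 turbines are running.

```python
# Illustrative N-out-of-M availability: does installing 13 turbines instead of 10
# meaningfully raise the odds that at least 10 are producing at any given time?
from math import comb

def prob_at_least(needed: int, installed: int, availability: float) -> float:
    """P(at least `needed` of `installed` units are up), assuming independent failures."""
    return sum(
        comb(installed, k) * availability**k * (1 - availability)**(installed - k)
        for k in range(needed, installed + 1)
    )

for n in (10, 13):
    print(f"{n} installed: P(>=10 running) = {prob_at_least(10, n, 0.95):.3f}")
# With these assumed numbers, 10 turbines give ~0.60 and 13 give ~0.997.
```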

Northern Isles underwater data center

The Northern Isles underwater data center is designed to run without being staffed, which cuts down on human error. It uses a “fail-in-place” approach: failed components are not serviced, they are simply left in place. Operations are monitored by artificial intelligence. Mr. Cutler said, “There’s a lot of data showing that when people fix things they’re also likely to cause some other problem.”

Operating in “lights out” mode with no human presence allows most of the oxygen and water vapor to be removed from Northern Isles’ atmosphere; MSFT replaced the oxygen with 100% dry nitrogen. This environment should greatly cut the amount of corrosion in the equipment, a major problem in data centers on land. Mr. Cutler told DCK, “With the nitrogen atmosphere, the lack of oxygen, and the removal of some of the moisture is to get us to a better place with corrosion, so the problems with connectors and the like we think should be less.”

The Redmond Magazine article says Project Natick’s Phase 2 has already proved that it’s possible to deploy an underwater data center in less than 90 days “from the factory to operation.” The logistics of building underwater data centers are very different from building data centers on land. Northern Isles was manufactured via a standardized supply chain, not as a construction process. Mr. Cutler said, “Instead of a construction project, it’s a manufactured item; it’s manufactured in a factory just like the computers we put inside it, and now we use the standard logistical supply chain to ship those anywhere.”

Standard ISO shipping container

The data center is also more standardized: it was purposely built to the size of a standard ISO shipping container and can be shipped by truck, train, or ship. Naval Group shipped Northern Isles to Scotland on a flatbed truck. Mr. Cutler told DCK, “We think the structure is potentially simpler and more uniform than we have for data centers today … the expectation is there actually may be a cost advantage to this.”

The rapid deployment of these data centers doesn’t only mean expanding faster; it also means spending less capital. Mr. Cutler explained, “It takes us in some cases 18 months or two years to build new data centers … Imagine if instead … where I can rapidly get them anywhere in 90 days. Well, now my cost of capital is very different … As long as we’re in this mode where we have exponential growth of web services and consequently data centers, that’s enormous leverage.”

rb-

If Project Natick stays on the same trajectory, MSFT could bring data centers to any place in the developed or developing world without adding more stress on local infrastructure. MSFT’s Cutler told DCK “There’s no pressure on the electric grid, no pressure on the water supply, but we bring the cloud.”

As more of the world’s population comes online, the need for data centers is going to skyrocket, and having a fast, green solution like this would prove remarkably useful.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Run Your DC with a Chevy

Run Your DC with a Chevy

General Motors (GM) is using Chevy Volt batteries to power a data center. MLive reports that expired lithium-ion batteries retrieved from Chevrolet Volts help power the General Motors Enterprise Data Center at the Milford Proving Grounds in Milford, MI.

GM logo

GM recently announced that five batteries from first-generation Volts are working in parallel with a 74-kilowatt solar array and two 2-kilowatt wind turbines to green up the data center. The batteries have the capacity to provide backup power for four hours in the event of an outage, GM said. According to the article, the setup has given the Enterprise Data Center net-zero energy use on an annual basis, and extra power is sent back to the grid used by the Milford Proving Ground.
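As a sanity check on the four-hour claim, the arithmetic below uses two assumptions that are mine, not GM’s: a roughly 16 kWh nominal pack in a first-generation Volt, and the “up to 80 percent” of remaining capacity that GM cites later in this post.

```python
# Back-of-the-envelope: how much load can five second-life Volt packs carry for 4 hours?
FIRST_GEN_VOLT_PACK_KWH = 16.0  # assumed nominal pack capacity
SECOND_LIFE_FRACTION = 0.8      # "up to 80 percent of its storage capacity remains"
PACKS = 5
BACKUP_HOURS = 4

usable_kwh = PACKS * FIRST_GEN_VOLT_PACK_KWH * SECOND_LIFE_FRACTION
print(f"Usable storage: ~{usable_kwh:.0f} kWh")
print(f"Sustainable load over {BACKUP_HOURS} h: ~{usable_kwh / BACKUP_HOURS:.0f} kW")
# ~64 kWh total, or roughly a 16 kW load for four hours.
```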

First-gen Chevy Volts still have a lot of juice

As it readies to sell its all-new, second-generation Volt, GM said first-gen cars still have a lot of leftover juice in their battery packs for stationary use. Pablo Valencia, GM’s senior manager of battery life cycle management, said in a presser that the batteries still have value after they come out of the car.

Chevy Volt batteries to power a data center

Even after the battery has reached the end of its useful life in a Chevrolet Volt, up to 80 percent of its storage capacity remains … This secondary use application extends its life, while delivering waste reduction and economic benefits on an industrial scale.

The first-generation plug-in hybrid Volt went on sale in 2010 for the 2011 model year. It uses battery power to get an electric range of about 35-38 miles, before switching to gasoline.

Battery powered car

The 2016 Volt, unveiled last January in Detroit, will have about a 31% greater electric range than its predecessor. The second-gen Volt has about a 50-mile all-electric range and a total driving range of about 400 miles when combined with a gasoline engine.

rb-

According to the Detroit News, GM is working with unidentified partners to validate and test systems for other commercial and non-commercial uses.

Elon Musk‘s Tesla (TSLA) is also leveraging its car-based battery systems to develop a line of storage batteries for homes and SMBs called Powerwall. Powerwall is designed to store electricity for home use, to be used during peak consumption times when utilities charge the most. The device comes in several colors, including white, charcoal, red, and blue. There are two options: a 7-kilowatt-hour package using nickel-manganese-cobalt batteries and a 10-kilowatt-hour unit with a nickel-cobalt-aluminum battery.

 


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Energy Harvesting Displays

Energy Harvesting Displays

Over 90 percent of the displays sold use liquid-crystal display (LCD) technology. However, LCDs are tremendously energy inefficient, converting only about 5 percent of the light produced by a backlight into a viewable image. The LCD in a notebook computer consumes one-third of its power. MIT’s Technology Review reported on efforts at the University of Michigan to improve the efficiency of LCD panels and boost the battery life of phones and laptops.

Benq LCD monitor

The LCD screen remains dominant because manufacturers can make LCDs inexpensively on a vast scale. More energy-efficient displays are either too expensive to manufacture or cannot produce high-quality images. “The LCD is very inefficient, but it works,” Jennifer Colegrove at Display Search, a market research and consulting firm, told TR.

At Michigan, researchers are tackling one of the biggest culprits of wasted light in LCDs: color filters. The group, led by Jay Guo, is developing energy-harvesting color filters. Color filters are used in many displays, but the ones developed by Professor Guo’s team are suited to reflective “electronic paper” screens. These contain sub-pixel arrays that absorb ambient light and reflect red, green, or blue light.

Energy efficiency at Michigan

University of Michigan

Dr. Guo and his U of M colleagues combined a common polymer solar cell material with a color filter that his group invented last year. The photovoltaic color filter converts about two percent of the light that would otherwise be wasted into electricity.

U of M’s Guo estimates that full displays incorporating this photovoltaic filter could generate tens of milliwatts of power, enough to extend the life of a cell phone battery. The photovoltaic color filter is described in a paper published online in the journal ACS Nano.
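The “tens of milliwatts” figure is easy to sanity-check with a parametric estimate. Every number below except the two percent conversion figure is an assumption I picked for illustration; it is not from the ACS Nano paper.

```python
# Rough plausibility check of the harvested-power estimate for a reflective display.
DISPLAY_AREA_M2 = 0.03          # assumed: roughly a 10-inch-class screen
AMBIENT_IRRADIANCE_W_M2 = 100   # assumed: bright indoor light / shaded daylight
WASTED_FRACTION = 0.9           # assumed: share of incident light not reflected to the viewer
CONVERSION_EFFICIENCY = 0.02    # "about two percent," per the article

harvested_w = (DISPLAY_AREA_M2 * AMBIENT_IRRADIANCE_W_M2
               * WASTED_FRACTION * CONVERSION_EFFICIENCY)
print(f"Estimated harvest: ~{harvested_w * 1000:.0f} mW")  # ≈ 54 mW, i.e. tens of milliwatts
```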

“It’s an intriguing idea,” says Gary Gibson, a scientist developing reflective color displays at HP Labs in Palo Alto, California. Low brightness is a recurring problem for color electronic paper. If the color filter proves practical, says Gibson, energy harvested from ambient light could power a backlight and make the display brighter.

rb-

Go Blue

Harvesting energy from the environment with the device is a trick that could boost the battery life of phones and laptops. Oh yeah, the article also talked about similar work at UCLA. Go Blue!


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

iDEN Shutdown is a Massive Recycling Project

Sprint iDEN Shutdown Makes Massive Recycling Project

Sprint Nextel (S) is set to shut down its Nextel iDEN network to make room for LTE. The shutdown will result in nearly 30,000 iDEN installations being taken off the air. All of that e-waste needs to be part of a recycling project.

FierceBroadbandWireless explains that Sprint has deployed FDD-LTE using the 1900 MHz Band 25 spectrum. Sprint holds two 5 MHz channels in the G band adjacent to the PCS spectrum. The carrier’s Band 26 800 MHz spectrum is currently used for CDMA as well as end-of-life iDEN service. Sprint will gain another two 5 MHz channels for LTE once it shutters its iDEN network on June 30 and re-purposes that 800 MHz spectrum for LTE.

Sprint without Nextel logo

According to Sprint, its last full day of iDEN service will be June 29. Sprint said it will close switch locations “in rapid succession on June 30.” After the shutdown, equipment will be powered down and backhaul at each cell site will be eliminated. Tens of thousands of iDEN cell sites will be deconstructed and taken off the air. Sites where CDMA and LTE equipment are colocated will be left intact, minus the iDEN gear, said Sprint.

100 million pounds of recycling

The shutdown will generate over 100 million pounds of leftover iDEN network gear. The equipment and materials include cables, batteries, radios, server racks, antennas, air conditioners, and other equipment. Much of the equipment is being staged for recycling vendors. Most concrete shelters housing iDEN cell sites will be crushed and turned into a composite for roads and bridges, said Sprint.

Recycling a nationwide wireless network is a huge undertaking

The iDEN recycling project is expected to continue into early 2014. “Recycling a nationwide wireless network is a huge undertaking, but one that we’re committed to,” said Bob Azzi, senior vice president-network. “The company has earned a reputation for environmental stewardship. The iDEN recycling effort extends our commitment.”

The market for used iDEN equipment is pretty limited. GigaOm points out that iDEN is a dying technology, and Nextel was the world’s largest iDEN carrier. iDEN’s sole manufacturer, Motorola Solutions, still supports the technology, and a handful of operators in North and South America, as well as Asia, still use it.

Make money from recycling

The recycling and reusing move isn’t just about PR. GigaOm says that Sprint can save significant money by reusing its tech and could make money from recycling if it sells the scrap to a waste vendor. There are also some state laws that require the recycling of certain types of e-waste, particularly substances that are hazardous and could seep into a landfill.

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.