Tag Archive for Data center

What's a Petabit Network?

It seems like only a couple of months ago that we were excited about fiber optic cable that twisted light to carry data at 1.6 Tbps per strand. Now a Petabit network is the new benchmark. U.K. and Japanese researchers mashed up software-defined networking (SDN) and multicore fiber to produce the first Petabit pipe, according to Kevin Fitchard at GigaOM. A Petabit is one quadrillion (1,000,000,000,000,000, or 10^15) bits, or one thousand Terabits.
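To keep the units straight, here is a quick sketch of the conversions (networking uses decimal prefixes, so 1 Pbps = 1,000 Tbps):

```python
# Decimal (SI) prefixes, as used for network speeds.
TERABIT = 10**12
PETABIT = 10**15

assert PETABIT == 1_000 * TERABIT  # one thousand Terabits

# The twisted-light fiber mentioned above carried 1.6 Tbps per strand.
strand_tbps = 1.6
strands_for_a_petabit = PETABIT / (strand_tbps * TERABIT)
print(strands_for_a_petabit)  # 625.0 -- it would take 625 such strands to reach 1 Pbps
```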

Petabit network uses multicore fibers

The researchers mashed up multicore fibers and SDN to make very high-speed networks programmable. GigaOM speculates this will allow carriers to adjust network capacity and latency to meet the needs of traffic traveling over their networks. First, GigaOM explains that the fiber is unlike today's single strands of glass, or cores, that carry a single beam of light down the fiber. Multicore fiber is exactly what its name implies: multiple cores, each carrying a single core's worth of capacity over the same link. Professor Dimitra Simeonidou at the University of Bristol called current single-core fiber a capacity bottleneck.

Space Division Multiplexed

The multicore group, led by NICT and NTT in Japan, built a 450 km (280 mile) section of fiber optics using 12 cores in two rings, capable of transmitting 409 Tbps in either direction. That's 818 Tbps in total, which GigaOM calls within spitting distance of seemingly mythical Petabit speeds. The research relies on Space Division Multiplexing (SDM) provided by the multicore fibers.
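The arithmetic behind those numbers is simple enough to sketch:

```python
# 12 cores in two rings, each direction carrying 409 Tbps.
per_direction_tbps = 409
total_tbps = 2 * per_direction_tbps          # both directions combined
print(total_tbps)                            # 818 Tbps

# How close is that to a (decimal) Petabit per second?
petabit_tbps = 1_000
print(f"{total_tbps / petabit_tbps:.1%}")    # 81.8% of the way there
```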

In order to control the massive bandwidth, a team from the High Performance Networks Group at the University of Bristol created an OpenFlow software-based control element to manage those enormous capacities. The Brits implemented an interface that dynamically configures the network nodes so they can more effectively deal with application-specific traffic requirements such as bandwidth and Quality of Transport.
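The article gives no implementation details, but the core idea of an SDN control plane matching flows to per-core capacity might look something like this toy sketch (all class names, capacities, and the placement policy are hypothetical, not Bristol's actual design):

```python
# Hypothetical sketch: a controller places application flows onto fiber
# cores by requested bandwidth, the way an OpenFlow-style control plane
# programs forwarding from a central view of the network.
from dataclasses import dataclass

@dataclass
class Core:
    core_id: int
    capacity_gbps: float
    allocated_gbps: float = 0.0

    def free(self) -> float:
        return self.capacity_gbps - self.allocated_gbps

@dataclass
class MulticoreLink:
    cores: list

    def admit(self, flow_gbps: float):
        """Place a flow on the first core with enough headroom."""
        for core in self.cores:
            if core.free() >= flow_gbps:
                core.allocated_gbps += flow_gbps
                return core.core_id
        return None  # no core can carry the flow

# A toy 12-core link, 100 Gbps per core (hypothetical figures).
link = MulticoreLink([Core(i, 100.0) for i in range(12)])
print(link.admit(60.0))  # first 60 Gbps flow lands on core 0
print(link.admit(60.0))  # doesn't fit on core 0's remaining 40, spills to core 1
```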

According to the researchers, this was the first time SDN was used on a multicore network. The University of Bristol presser announcing the technology says it will overcome critical capacity barriers that threaten the evolution of the Internet.

rb-

OK, so that's really, really, really fast. We also know from a 2011 New Scientist article that the total capacity of one of the world's busiest routes, between New York and Washington, DC, is only a few Terabits per second. With bandwidth-hungry applications like cloud computing, social media, and video streaming continuously growing, network planners at firms like AT&T (T), Verizon (VZ), and the NSA are forced to find new ways to grow their capacity.

Data center

Comcast (CMCSA) just finished a 1 Tbps field trial on a production network between Ashburn, VA, and Charlotte, NC. Most likely, the first place Pbps networking will be used is in the mega-data centers of the likes of Google (GOOG), Facebook (FB), or Microsoft (MSFT).


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

More Tech Trouble at School

It's not a good time for tech in schools. The security woes at school are not limited to the iPad debacle at LAUSD. (rb- You can see my coverage here. Updates since the first article: LAUSD started confiscating the iPads and delayed the district-wide rollout one year, until 2015.) GigaOM's Ki Mae Heussner writes that Guilford County Schools in North Carolina has suspended its tablet program with Rupert Murdoch's News Corp's Amplify after reports of faulty equipment.

NC school district suspends tablet program

The district reportedly spent $16.4 million ($299 per device, plus a two-year subscription at $99 per year) of a $30 million Race to the Top grant to pay for the tablets and content. The device is a 10-inch ASUS (2357) tablet running the Google (GOOG) Jelly Bean Android operating system. It comes pre-loaded with content and apps curated by Amplify, and it enables teachers to distribute content across a class or grade level and control the content on students' screens.

GigaOM cites the school district’s website, which says they have sent 10% of their 15,000 devices back to Amplify because of broken screens. About 2,000 cases have also been problematic. In one instance, a student returned a defective charger, reporting that overheating caused the plastic to melt. While the district said it expected a few glitches with the rollout, school officials decided to pause the program for safety’s sake. GigaOM claims the pause is a big setback for Amplify, which launched its education-optimized tablet at hipster South by Southwest earlier this year.

Since its launch, skeptics have wondered how schools would respond to the privacy questions and the prospect of doing business with Amplify's parent company, News Corp., given its phone-hacking scandal. Ms. Heussner speculates that the suspension could give schools more reason for pause when it comes to embracing the new technology.

Asus told GigaOM that of the 500,000 chargers of this kind it has shipped globally, only the one in Guilford overheated and melted. Justin Hamilton, Amplify's SVP of corporate communications, seems to be blaming the customer. He claimed the broken-screen rate in Guilford is higher than in other school districts. "We're working very closely with the district on this and hope to have things resolved and the program back up and running very soon," Mr. Hamilton said.

Indiana mobile security fail

In Indiana, Education Week reports that between 300 and 400 students in the Center Grove school district circumvented the security on district-issued Apple (AAPL) iPads within hours of receiving the devices, according to a report last week in the Daily Journal.

Apparently, students found ways to reprogram the iPads so they could download games and apps for social media sites, according to the report. Center Grove officials attributed the problem to their security program not being able to handle the 2,000+ devices they distributed.

Keith Krueger, the CEO of the Consortium for School Networking, said such problems are increasingly common as districts deploy an increasing number of devices. "Kids and adults find ways to hack through things, and it can spread like wildfire," he said. "It's frustrating, and it's a huge challenge for any district."

Data center failures

In addition to the tablet troubles, Data Center Knowledge’s Rich Miller reports several school data center failures. According to DCK, two public school systems suffered data center failures that crippled their IT systems.

In Oregon, the Beaverton School District experienced several days of disruption after an errant alarm set off its data center fire suppression system. The fire suppression system damaged hard drives and servers, leaving Beaverton schools unable to use email or access class lists, student schedules, and online textbooks. "It knocked all of the systems in the data center off-line," said Steve Langford, chief technology officer. "All of the systems that staff needs to do their jobs." District IT staff worked over the Labor Day weekend to replace the damaged systems.

In California, the Davis Unified School District started school without key IT services after the district’s servers overheated. DCK reports an air conditioner unit failed, allowing the temperature in the server room to rise to 120 degrees F. “There’s an incredible impact on everyone in the whole organization,” says the district’s Kim Wallace. “Students can’t access computers. Teachers can’t take attendance. Parents can’t email. We can’t email out.” The DCK article said staff were still troubleshooting damaged equipment and lost data.

rb-

The best strategy, COSN’s Krueger said, is to combine the best possible security filters and other technical measures with a comprehensive responsible or acceptable use policy that students and families must sign and a commitment to enforcement. “It’s not surprising that a school district would have some breaches,” he said. “The question is how do you leverage it into a teachable moment?”

Who needs the teachable moment? Sure, the kids need to understand there are real consequences for their actions, but can the politicians and administrators be taught to be serious about IT? It seems to me that most of these failures are management failures. It is probable that these failures could have been reduced with proper project management.

It is my experience that many administrators do not recognize project management professionals. It appears they would rather stick with the good ole boy network and hire their less qualified friends or the professional BSers.

Now, how about project management? A modern backup system? A disaster recovery plan? A BCP?



Data Growth Tests Storage Capabilities

Data Center Knowledge ran an article by Steven Rodin, CEO of Storagepipe Solutions, that lays out the challenges that those of us charged with managing backups face every day. Storagepipe Solutions, a provider of online backup services for business since 2001, has identified several emerging storage trends that organizations will need to overcome in the future.

In the early days, the author says, organizations were primarily concerned with data protection, encryption, and automation. The era of "Big Data" has changed those demands, and the new demands are overwhelming most backup and storage systems. The article cites data from IBM (IBM), which claims that worldwide annual data production has actually exceeded worldwide storage capacity. Big Blue believes that demand for storage capacity is growing nearly 60 percent a year. The gap between the data that organizations produce and their ability to store it will continue to grow for years.
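At 60 percent a year, the compounding is brutal; a quick sketch of IBM's figure:

```python
# Storage demand compounding at ~60% per year (IBM's figure, per the article).
growth = 1.60
demand = 1.0  # today's demand, normalized to 1

for year in range(1, 6):
    demand *= growth
    print(f"year {year}: {demand:.1f}x today's demand")
# After 5 years demand is ~10.5x today's; capacity growing any slower
# falls further behind every year.
```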

The Storagepipe Solutions CEO identified a number of important storage trends that are accelerating the growth rate of corporate data. He provided a few of the most important factors.

Cheaper Storage Hardware

The price of hard drive capacity has fallen exponentially ever since Moore's Law was introduced. This has changed attitudes toward backups. The article says that today, hardware is so cheap and abundant that attitudes have shifted to a "Better keep this. We may need it someday" mentality.

New technologies, such as advancements in compression, deduplication and hardware virtualization, have improved overall storage utilization and further accelerated the rate at which the cost-per-gigabyte of storing data is falling.

Cheap and Abundant Bandwidth

Internet bandwidth is no longer a bottleneck. Bandwidth availability has accelerated the growth of file sharing and online storage. Large files are now copied and distributed at an exponential rate, which has made duplicate data a major source of storage waste and data growth. The CEO of the Toronto-based firm calculates that if one person shares a 1 GB file with 500 people, that's half a terabyte of storage consumption.
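That half terabyte is 499 redundant copies, which is exactly what the deduplication technologies mentioned above attack: a content-addressed store hashes each file and keeps identical content only once. A minimal sketch of the idea (my own toy illustration, not any vendor's design):

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical content is kept once."""
    def __init__(self):
        self.blocks = {}  # sha256 hex digest -> content

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(key, data)  # store only if unseen
        return key

store = DedupStore()
payload = b"pretend this is a 1 GB file"
for _ in range(500):          # 500 people save the same file
    store.put(payload)

print(len(store.blocks))      # 1 -- one physical copy, not 500
```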

Business is Going Paperless – Email has replaced letters, eBooks and tablets have nearly replaced paper books, and digital imaging has replaced photographs and x-rays. Not only are paperless offices better for the environment, but Mr. Rodin writes, they are also more productive, flexible and better able to extract value from their business data. Many industries are using more and more video (which is highly storage intensive) for marketing online, security and communication.

Enhanced Automated Data Collection Capabilities

Automated data collection is one of the fastest-growing areas in the "big data" space. With every move we make, the article says, we're generating GPS data, web traffic statistics, power usage data, surveillance video, and a broad range of other data that companies and governments are collecting.

The author calls automated data collection the "Pandora's Box" of the big data revolution. The information being collected about us through the electronic devices we use every day could present a threat to our privacy, but it also has the potential to offer tremendous value to society.

Advances in Data Analysis Technology

The blog says that until recently, data analysis was almost exclusively performed on structured relational databases, maintained and organized by humans. Now a new approach to data storage focuses on rapid analysis and processing of vast data volumes. Technologies like Hadoop, Cassandra, MapReduce, and NoSQL have given birth to a whole new class of services and have revolutionized the way organizations think about the data they collect. Organizations can now get more insight into their internally generated business data by integrating external feeds and databases into their reporting and analysis.
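The canonical example of that approach is the MapReduce word count, which splits analysis into a map step that emits key-value pairs and a reduce step that aggregates them. A single-process sketch of the idea (real Hadoop jobs distribute these phases across a cluster):

```python
from collections import Counter
from itertools import chain

def map_phase(doc: str):
    """Map: emit a (word, 1) pair for each word in the document."""
    return [(word, 1) for word in doc.lower().split()]

def reduce_phase(pairs):
    """Reduce: sum the counts for each word."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

docs = ["big data big storage", "big analysis"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
print(reduce_phase(pairs)["big"])  # 3
```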

The Growing Strategic Importance of Data

In the past, data was simply a tool that assisted in decision-making and helped companies execute on their strategic objectives. But recently Google (GOOG), Facebook (FB), Apple's (AAPL) iTunes, and other brands have built their entire corporate strategy around the data they own. The DCK article states that information is power, and it's now more powerful than ever.

Regulatory Compliance

Even if companies wanted to cut the amount of data they store, they wouldn't always be able to. Laws like PIPEDA, HIPAA, SOX 404, and many others are forcing companies to keep historical archives of their exponentially growing business data going back several years.

As this data grows, storage increasingly becomes a major business problem. Also, companies must plan for cost-efficient search and retrieval of these large historical data volumes to stay ready for an unexpected electronic discovery request.

As the scale and complexity of big data storage grows, it’ll quickly reach a point where manual handling is no longer practical, desirable, economical, or even possible. Automation will become absolutely essential when it comes to backing up big data.

Many big data applications have serious privacy implications for the customers that benefit from their use. So security will become a top priority for backup administrators. Gone are the days of unencrypted backup tapes.

Big data has created a whole new class of applications built on real-time data. These applications require much more frequent backups to optimize Recovery Point Objectives (RPOs). Strategic big data apps will need minimized downtime. This means smaller backup windows, built-in redundancy, and server fail-over to disaster recovery sites.
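The RPO is the maximum tolerable data loss, and in the worst case it equals the interval between backups, since a failure can strike just before the next backup runs. A trivial illustration of why real-time apps push for frequent backups:

```python
# Worst case, a failure hits just before the next backup, so the data
# at risk is everything written since the last one.
def worst_case_loss_minutes(backup_interval_minutes: float) -> float:
    return backup_interval_minutes

# A nightly backup vs. 15-minute incrementals for a real-time app:
print(worst_case_loss_minutes(24 * 60))  # 1440 minutes of data at risk
print(worst_case_loss_minutes(15))       # 15 minutes of data at risk
```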

That’s why many organizations are opting to outsource their data backups by partnering with experts who run ahead of the trends and who can help with the complexity of some situations.



MSFT Powers Data Center with Sewers

The prize for the most unlikely clean power source may be going to Microsoft (MSFT). Greenbiz.com reports that the boys from Redmond are working on powering data centers with sewage. Microsoft plans to power a demo data center with sewage; yeah, poo.

FuelCell Energy (FCEL) recently revealed to the blog that it is working with MSFT on a $5.5 million trial. The trial will use biogas from a wastewater treatment facility to power a fuel cell. The fuel cell at the Dry Creek Water Reclamation Facility in Cheyenne, WY will provide "ultra-clean and carbon-neutral electricity" to a Microsoft data center.

InfoWeek says that biogas consists mostly of methane and carbon dioxide. It may also contain small amounts of other gases, including hydrogen sulfide and nitrogen. The power is produced by anaerobic digestion, a process in which bacteria that live only in places without air break down organic, biodegradable matter, better known as sewage, animal manure, municipal waste, and plant material.

The initial trial will use one of FuelCell Energy's sub-megawatt Direct FuelCell (DFC) power plant systems. The DFC will generate 200 kW of power for a Microsoft IT pre-assembled component (ITPAC) modular data center. The ITPAC is set up to resemble a standard data center environment. Any electricity not used by the data center will help power the water treatment plant, and the system will also provide usable heat for the facility.
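For scale, a back-of-the-envelope sketch of what 200 kW might support. The per-server wattage and PUE figures here are my assumptions for illustration, not numbers from the article:

```python
# Hypothetical power budget for the 200 kW ITPAC feed.
fuel_cell_kw = 200
watts_per_server = 300   # assumed average draw per server (not from the article)
pue = 1.2                # assumed power usage effectiveness for a modular unit

it_load_kw = fuel_cell_kw / pue                   # power left for IT gear
servers = int(it_load_kw * 1000 / watts_per_server)
print(servers)  # roughly 555 servers on the fuel cell's output
```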

Direct FuelCell power plant systems

Power Engineering explains that stationary DFC power plants convert a fuel source into electricity and usable high-temperature heat suitable for making steam. DFC plants are fuel flexible, capable of operating on natural gas, renewable biogas, directed biogas, and other fuels including propane. The fuel cell generates electricity and heat electrochemically.

Gregg McKnight, general manager for data center advanced development at Microsoft, told Greenbiz.com that with the company's recent commitment to become "carbon neutral" by 2013, it is committed to exploring the viability of a number of renewable energy sources. "Microsoft is researching new methods to help our operations become more efficient and environmentally sustainable," he said. "This project will study methods to provide an economical and reliable power supply for data centers that is also scalable and economical for use by other industries."

rb-

OK, let the snarky comments rip about MSFT software powered from the sewer, or, as one commenter noted, leave it to Microsoft to power its cloud services with a very different kind of cloud: a smellier, gaseous one.

I covered HP’s (HPQ) plans to power its data centers with cow manure here. It looks like Microsoft aims to build more data plants near other sources of renewable energy like landfills, wastewater treatment plants, and even dairy farms.



Microsoft Eliminating Backup Generators

The venerable diesel backup generator has long been a symbol of reliability for mission-critical installations. Backup generators provide the emergency power required to keep servers online during utility power outages. Data Center Knowledge reports that the growing focus on using clean energy to power large data centers is prompting Microsoft (MSFT) and other tech titans to ditch their generators, along with their diesel emissions.

Microsoft is the latest company to announce its intention to cut its use of diesel generators. The move is part of a broader initiative to make Microsoft's server farms more sustainable and less reliant on the utility grid, according to DCK. Microsoft Utility Architect Brian Janous wrote in a recent blog post:

We are currently exploring alternative backup energy options that would allow us to provide emergency power without the need for diesel generators, which in some cases will mean transitioning to cleaner-burning natural gas and in other cases, eliminating the need for backup generation altogether.

DCK speculates that the reference to natural gas suggests Microsoft is preparing to add fuel cells to replace its generators. That could be good news for Bloom Energy, which recently scored wins to replace generators and UPS units at eBay's (EBAY) new data center in Utah and to supplement power at Apple's (AAPL) data center in North Carolina.

DCK explains that the Bloom Energy Server is a solid oxide fuel cell technology that converts fuel to electricity through an electrochemical reaction, without any combustion. Bloom boxes can continue to run during grid outages because they are housed on the customer's premises, and they can run on natural gas or a range of other biofuels, including methane gas from landfills.

Another reason MSFT may be replacing generators, according to DCK, is that they have caused Redmond several headaches in recent years, including an Azure cloud outage in Europe (when multiple generators didn't start during a utility outage) and public controversy about whether the diesel emissions from Microsoft's generators in Quincy, WA could cause health problems for local residents. Diesel engine exhaust is a regulated pollutant and can be toxic in high concentrations.

Or Microsoft's motivation could be to become less dependent on the utility grid and use renewable energy to power its servers, the blog says. The company says its "data plants" will break new ground in integrating electricity and computing, bringing together data centers and renewable power generation.

One type of renewable energy Microsoft has explored is a waste-powered data center, which could be built on the site of a water treatment plant or landfill. In his blog post, Janous indicated that Microsoft is evaluating a biomass project in Europe (rb- I wrote about HP's plan to use manure to run a data center here).

"Given the unreliability of the electric grid and the need for continuous availability of cloud services, Microsoft maintains diesel generator backup at all of our data centers…" Janous wrote. "These generators are inefficient and costly to operate. From both an environmental and a cost standpoint, it makes no sense to run our generators more than we absolutely must."

Microsoft is also considering "long-term purchases from larger grid-connected installations that would displace some portion of our grid purchases," Janous wrote. Google (GOOG) has embraced a similar strategy, using power purchase agreements to add more than 200 megawatts of wind power to the local utility grids that support its data centers.

Microsoft is taking steps to position itself to make bulk power deals, according to DCK. "We have recently signed on as an advisory board member with Altenex, an operator of a network that enables member companies to more efficiently engage with developers of renewable energy projects," Janous said. "We expect this engagement with Altenex to improve our ability to identify and evaluate cost-effective clean energy projects."

rb-

I recall, as a newbie techie, the first time I had to be in on a Sunday morning to work with the site engineer to crank up the 100 HP Cummins standby generator. The firm ran the monthly test to make sure the critical systems stayed up. The generator was enclosed in a secure room that contained the heat and noise, and the exhaust was vented out. One of my regular jobs was to kick the standby 55-gallon drum of diesel with the hand pump on it to make sure there was fuel available for the generator.

