Tag Archive for IBM

Data Centers To Go Wireless

MIT’s Technology Review reports that researchers from IBM (IBM), Intel (INTC), and the University of California, Santa Barbara have come up with a way to improve data transmission in data centers. Heather Zheng, the associate professor of computer science at UCSB who led the research, says wireless is the answer to the in-rack cabling mess usually found in data centers. In their paper (PDF), the researchers say that transmitting data wirelessly within a data center would be simpler than rewiring data centers for tech titans like Google (GOOG), Facebook, or Twitter.

Line-of-sight connections

The earlier challenge for multi-gigabit wireless in the data center was that it required a line-of-sight connection to be useful. Achieving the required data center speed was impossible in the maze of metal racks, HVAC ducts, and electrical conduits that make up most data centers.

TR reports that the researchers’ solution is to bounce 60-gigahertz Wi-Fi signals off the ceiling, which could boost data transmission speeds by 30 percent. Stacey Higginbotham at GigaOm points out that this could yield data transfers of up to 500 gigabits per second. She says current Ethernet links in data centers generally run at 1, 10, or perhaps 40 gigabits per second.

60-gigahertz Wi-Fi for servers

Ms. Zheng and colleagues used 60-gigahertz Wi-Fi, which has bandwidth in the gigabits-per-second range and was developed for high-definition wireless communications, according to TR. However, it has its limitations, says Ms. Zheng. To maximize the bandwidth and reduce interference between signals, it needs to use 3D beamforming to focus the beams in a direct line of sight between endpoints. “Any obstacle larger than 2.5 millimeters can block the signal,” she says in the TR article.

One way to prevent the antennas from blocking each other would be to allow them to communicate only with their immediate neighbors, creating a type of mesh network. But that would further complicate efforts to route the data to the proper destinations, Professor Zheng told TR. Bouncing the beams off the ceiling directly to their targets not only ensures direct point-to-point communication between antennas but also reduces the chances that any two beams will cross and cause interference. “That’s very important when you have a high density of signals,” she says.

Flat metal plates placed on the ceiling offer near-perfect reflection. “You also need an absorber material on the rack to make sure the signal doesn’t bounce back up,” says Ms. Zheng.
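The ceiling-bounce setup reduces to simple mirror geometry: a beam aimed at a flat reflector midway between two racks travels as if to a virtual antenna mirrored above the ceiling plane. A minimal sketch, with hypothetical rack and ceiling dimensions (none of these numbers come from the paper):

```python
import math

def ceiling_bounce(rack_height_m, ceiling_height_m, rack_separation_m):
    """Tilt angle and total path length for a beam bouncing off a flat
    ceiling reflector midway between two equal-height racks."""
    rise = ceiling_height_m - rack_height_m      # vertical climb to the ceiling
    half_run = rack_separation_m / 2             # reflection point is midway
    tilt_deg = math.degrees(math.atan2(rise, half_run))  # elevation above horizontal
    path_m = 2 * math.hypot(rise, half_run)      # up to the ceiling and back down
    return tilt_deg, path_m

# Hypothetical numbers: 2 m racks under a 4 m ceiling, racks 6 m apart.
tilt, path = ceiling_bounce(2.0, 4.0, 6.0)
```

With these made-up dimensions, the beam needs roughly a 34-degree upward tilt and travels about 7.2 m, only slightly longer than the 6 m direct path that the intervening racks would block.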

Wireless can add 0.5 terabytes per second

According to Technology Review, the UCSB team worked with Lei Yang from Intel Labs in Oregon and Weile Zhang at Jiao Tong University in Xi’an, China, to simulate a 160-rack data center to see how the system might work. “Our simulation shows that wireless can add 0.5 terabytes per second,” Ms. Zheng says.

IBM is also looking into using wireless technology in data centers, Scott Reynolds, a researcher at IBM’s T.J. Watson Research Center in Yorktown Heights, NY, who has been developing 60-gigahertz systems, told TR. “These data centers are just choked with cables,” he says. “And so every time you want to reconfigure one it’s very labor-intensive and expensive.” But one problem with turning to wireless transmission, he adds, is that “you need to have hundreds of these wireless data links operating in a data center to be useful.” Since 60-gigahertz Wi-Fi has only four data channels, it’s important to configure the beams so they don’t interfere with each other.
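With only four channels, keeping hundreds of links from stepping on each other is essentially a graph-coloring problem: links whose beams would cross must land on different channels. A rough sketch of a greedy assignment (the link names and conflict graph here are invented for illustration, not taken from the research):

```python
def assign_channels(links, conflicts, num_channels=4):
    """Greedy channel assignment: give each wireless link the lowest
    channel not already used by any link it conflicts with.
    Returns None for a link if all channels are taken by its neighbors."""
    assignment = {}
    for link in links:
        used = {assignment.get(other) for other in conflicts.get(link, [])}
        assignment[link] = next(
            (c for c in range(num_channels) if c not in used), None)
    return assignment

# Hypothetical topology: five rack-to-rack links; conflicts mark pairs
# whose reflected beams would cross (the relation is kept symmetric).
links = ["A-B", "B-C", "C-D", "A-C", "B-D"]
conflicts = {
    "A-B": ["B-C", "A-C", "B-D"],
    "B-C": ["A-B", "C-D", "A-C", "B-D"],
    "C-D": ["B-C", "B-D"],
    "A-C": ["A-B", "B-C"],
    "B-D": ["A-B", "B-C", "C-D"],
}
channels = assign_channels(links, conflicts)
```

Greedy coloring is only a sketch of the idea; with hundreds of dense links, a scheduler would also need to reorder links or fall back to time-sharing a channel when four colors are not enough.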

Mark Thiele, the EVP of data center technology at Switch Communications’ SuperNAP data center, told GigaOm that the research is worth following because network latency inside the data center can be a bottleneck today for applications ranging from financial trading to moving gigantic data sets around.

TR reports Ms. Zheng and her colleagues are now working on building a prototype data center to put their solution into practice.

rb-

Having just done a small data center cleanup, I find the idea appealing. We pulled two generations of cabling, IBM Type 1 and a bunch of Cat 3 multi-pair, out from under the deck.

Ms. Higginbotham says the choice of 60 GHz for the data center is a smart move. Intel is pushing 60GHz for consumer use, under the WiGig brand (I wrote about WiGig in 2010 here). This means the chips would be cheap.

Some of the possible security issues raised by running Wi-Fi in the data center are tempered by using the 60 GHz range. She says that if you are worried about someone standing outside the data center trying to eavesdrop on the data you are transmitting, don’t be: 60 GHz signals deteriorate rapidly over distance.
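That rapid deterioration follows from physics: free-space path loss grows with frequency (60 GHz loses about 28 dB more than 2.4 GHz over any given distance), and oxygen absorption peaks near 60 GHz at roughly 15 dB per kilometer. A quick sketch of the standard path-loss formula:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss at data-center scale vs. across-the-street distance at 60 GHz.
loss_10m = fspl_db(10, 60e9)    # roughly 88 dB
loss_100m = fspl_db(100, 60e9)  # roughly 108 dB, before oxygen absorption
```

Every tenfold increase in distance adds another 20 dB, so a link engineered for a few meters of in-room range is far below the noise floor by the time it leaves the building.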

Of course, change is hard. Data center staff will have to learn wireless, top-of-rack switches will need radio cards installed, Wi-Fi reflective panels will have to be mounted on the data center ceiling, and the servers will need a signal-absorbing surface so the Wi-Fi signals don’t continually bounce around the data center.

If you are confused about WiGig, Wi-Fi, and the IEEE, EETimes says, “WiGig forged a deal with the Wi-Fi Alliance so its 60 GHz approach can be certified as a future generation of Wi-Fi. The group has aligned its technical approach with the existing IEEE 802.11ad standards effort on 60 GHz.”

Now if only they could do wireless electricity…


 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

U.S. Firms Set Record Hiring H-1B Visa Holders

Corporate America’s assault on the middle class continues. Despite the jobless epidemic, U.S. companies are tripping over themselves to fill high-paying job openings with workers from overseas. Business Insider reports that tech titans, led by Microsoft (MSFT) and IBM (IBM), have already maxed out their allotment of 65,000 H-1B visas.

The article says that U.S. companies have set a three-year record in how quickly they reached the cap for H-1B workers. The application process for 2012 opened on April 1, and on November 23 the U.S. Citizenship and Immigration Services (USCIS) announced that the cap had been reached.

But there are more than 65,000 jobs at stake. The USCIS also received “more than 20,000 H-1B petitions filed for persons exempt from the cap under the ‘advanced degree’ exemption,” it said. In addition, petitions for workers who already have their visas are not counted toward the cap.

The H-1B visa is a temporary work visa for those classified as “skilled workers,” such as IT staff, engineers, doctors, and scientists, and the pay is good.

 

 


Cloud Computing Risks

Cloud computing is a term even non-IT folks will have heard at least once by now, fueled by the rise of Software-as-a-Service (SaaS) and virtualization. The idea is that IT services and processing capabilities can be housed more efficiently in a data center and delivered over the Internet on demand.

Dr. Dobb’s editor-in-chief Andrew Binstock told FierceCIO that the primary advantage of relying on cloud providers is that their combined expertise on the security and reliability front is in all likelihood better than that of most SMBs and even some larger IT shops.

Bob Violino at Internet Evolution writes that cloud computing offers some clear benefits for organizations: lower costs, automated software updates, greater flexibility, and the ability for IT staff to focus on more strategic projects and not day-to-day maintenance tasks.

It’s easy to get caught up in the cloud excitement, with major IT vendors such as Amazon (AMZN), Apple (AAPL), Dell (DELL), Google (GOOG), HP (HPQ), IBM (IBM), and Microsoft (MSFT) pushing the concept and rolling out cloud offerings. But organizations looking into cloud computing need to consider some key risks as well.

Larry Ellison, the chief executive of Oracle, told shareholders in 2008 that cloud technology is a fad that lacks a clear business model. “I think it’s ludicrous that cloud computing is taking over the world,” Ellison said. “It’s the Webvan of computing.”

Richard Stallman, the founder of the Free Software Foundation, sees cloud computing as a trap that will result in people being forced to buy into locked, proprietary systems that will only cost more over time. He told The Guardian: “It’s stupidity. It’s worse than stupidity: it’s a marketing hype campaign.”

Some of the cloud risks are well documented, but as the push for cloud services continues, a few risk points are starting to come into focus:

Data Privacy. When it comes to the U.S., the Fourth Amendment states that people should “be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures…” But web-hosted applications and cloud services are too new for the courts to have offered far-reaching guidance on data privacy online. Data stored outside of the country makes data privacy issues even more complex.

Information security. A report from the World Privacy Forum discusses the issues related to cloud computing and the privacy and confidentiality of information. According to the report, “for some information and for some business users, sharing may be illegal, may be limited in some ways, or may affect the status or protections of the information shared.”

Even when no laws prevent a user from disclosing information to a cloud provider, the report says, disclosure may still not be free of consequences. “Information stored by a business or an individual with a third-party may have fewer or weaker privacy or other protections than information in the possession of the creator of the information.” A cloud provider’s terms of service, privacy policy, and location may significantly affect a user’s privacy and confidentiality interests, the report states.

Data Security. There are many threats to data online. The application or service provider could go belly up, hackers could attack, or you could simply be locked out of your account. The good news is that data portability and security policies are being scrutinized closely by several organizations.

intensely naïve

Mr. Binstock observed that no cloud storage provider will promise that they will not access your data under any circumstances. It is also common to find explicit clauses that allow law enforcement agencies access to your data.

Believing that this is acceptable because there is nothing incriminating in one’s data storage is, in his words, “intensely naïve.” The obvious problem, notes Mr. Binstock, is that any government agency examining your data is under no contractual obligation to you to keep it safe, or even to delete the copies that were created.

Neophobia

Chenxi Wang at Forrester noted that an effective assessment strategy must cover data protection, compliance, privacy, identity management, and other related legal issues. “In an age when the consequences and potential costs of mistakes are rising fast for companies that handle confidential and private customer data, IT security professionals must develop better ways of evaluating the security and privacy practices of the cloud services.”

Network. The idea of putting network health in the hands of the ISPs is very troubling. Have you ever tried to work with an ISP to find out why your round-trip latency is so high? Can your organization confidently define the bandwidth requirements of your apps? The end-to-end throughput needs? Where will your data really be? Will it take the same path today and tomorrow? Who will pick up the phone when you call to say “the cloud is slow”? Will you be able to understand them?

Complexity. As cloud computing evolves, “combinations of cloud services will be too complex and untrustworthy for end consumers to handle their integration,” according to a report from Gartner Inc. Daryl Plummer, chief Gartner fellow, notes:

Unfortunately, using [cloud] services created by others and ensuring that they’ll work — not only separately, but also together — are complicated tasks, rife with data integration issues, integrity problems and the need for relationship management.

Finances. Cloud computing changes the way software is purchased. The model of purchasing software once and then choosing whether to buy the newer version a few years later may be on the way out. With cloud computing, the vendor can simply raise prices the following month. It requires a different mindset: subscription fees as opposed to purchase. We will see how the public takes it.
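The shift is easy to see as a cumulative-cost comparison. A toy sketch with entirely invented prices (a one-time license plus periodic paid upgrades versus an annual subscription):

```python
def breakeven_year(license_cost, upgrade_cost, upgrade_every_yrs,
                   annual_subscription, horizon_yrs=10):
    """First year the subscription's cumulative cost exceeds perpetual
    licensing (initial license plus periodic upgrades), or None if it
    never does within the horizon."""
    for year in range(1, horizon_yrs + 1):
        upgrades = (year - 1) // upgrade_every_yrs   # upgrades bought so far
        perpetual = license_cost + upgrades * upgrade_cost
        subscription = annual_subscription * year
        if subscription > perpetual:
            return year
    return None

# Invented figures: $500 license, $300 upgrade every 3 years, $200/yr subscription.
year = breakeven_year(500, 300, 3, 200)
```

At these made-up prices the subscription overtakes the perpetual license in year three; a cheap enough subscription may never cross over, which is exactly the calculation the vendor controls by raising the fee.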

These are some of the issues that must be addressed if companies are to decide that cloud computing offers benefits that exceed the ROI of providing similar services in-house without increasing risk.

rb-

Sure, “the cloud” will work for most people most of the time, but if there are a lot of users, there will be a lot of errors. With 100,000 users, even a 10 percent problem rate over 10 years means 10,000 unhappy users.


 


Michigan Disaster Recovery Test Turns Into Disaster

State of Michigan IT officials are probably happy for a new week. The state’s IT infrastructure took two big hits last week. The folks in Lansing had a failure on Monday, 05-16-11, where nearly 25,000 employees were unable to use the state’s IT network for about three and a half hours, Kurt Weiss, public information officer for the Michigan Department of Technology, Management, and Budget (DTMB), said in a phone interview with InformationWeek. Apparently, an upgrade over the weekend to patch security holes had gone wrong somewhere, Mr. Weiss said. Access to the network was restored by 10:30 a.m.

On Wednesday, 05-18-11, a disaster recovery test at the Michigan DTMB turned into a disaster when a link to a mainframe computer was broken, reports MiTechNews. Around noon Wednesday, a link between the test environment and the production environment was severed by human error, taking out a mainframe computer. Mr. Weiss told MiTechNews:

A fiber link was broken by a state employee … We were working on a disaster recovery test, performing a test on the mainframe. During the test we went from test to real life disaster. The cord between testing and real life was severed. Corrupted files got loaded on the mainframe, and we crashed the mainframe.

The “big iron” failure affected many state offices, including 131 Secretary of State branch offices, which run 80,000 daily transactions. Other state operations also were affected, including the departments of corrections, treasury, and human services. Data on the affected mainframe included the bulk of the state’s driver’s license and motor vehicle registration information. The outage also knocked out the ability for police officers to look up driver’s license information (LEIN) and for automobile dealerships to transfer license plates for vehicles they sold, Mr. Weiss said.

The mainframe was up and running by Wednesday night, but computer applications were still inoperable due to file corruption. The system was finally restored after 5:00 PM on Thursday, according to Government Technology. The delay was caused by the data-recovery operations that were necessary as a result of file corruption during the outage. “We have had outages before, but not to this length or scale or duration,” Mr. Weiss said, “and actually not to this level of complexity. This one has been a much more difficult one to fix compared to the other outages.”

The mainframe that went down last week also is part of an old system that is in need of modernization, Weiss said, but Michigan’s budget woes have so far prevented the state from doing the upgrades it needs. “We do need to modernize all of those applications for the secretary of state,” he told InformationWeek.

Former Gateway Computers CEO and current Republican Governor Rick Snyder, when asked about the outage, told MiTechNews that it is another reason the state has to get the budget approved so it can focus on upgrading the old computer equipment used by the Michigan government. Some of this equipment is more than 30 years old.

The DTMB IT department is doing a root cause analysis of both incidents and plans to publish a “lessons learned” review of them once that is complete, Weiss said. No data was lost in either incident, although some data files were corrupted during the second and had to be restored through tape backup, he said.

IT officials are re-evaluating how to do such tests in the future in light of the incident, and another test will not be performed until this study is complete, he said.

rb-


Just put it back in the cow box

So now the boys and girls in Lansing know what it is like to work with ancient equipment because the Governor is cutting funding to everything to give a tax cut to businesses. I doubt that Snyder or his cronies have ever been in line for hours just to get new tabs. I have. Michigan needs to invest in its people and infrastructure, not tax breaks for businesses.

What do you think?

Invest in people and infrastructure so people want to stay in Michigan?

or

Cut spending and raise taxes to give businesses more profits?

 


China Creates Cloud Computing City

ComputerWorld says that China is employing IBM (IBM) to help it build a city-sized cloud computing center in northern China’s Hebei Province. The cloud computing center is being built in Langfang, a city between Beijing and Tianjin, in the new Langfang Range International Information Hub, IBM spokeswoman Harriet Ip told ComputerWorld. The complex will open in 2016 and be comparable in size to the Pentagon.

IBM will supply its data center design services, while the Chinese firm Range Technology Development (Google translation), an Internet data center services provider founded in 2009, will also work on the project. There will initially be seven low-slung data centers spanning up to one million square feet, with room for three more units on either side. There are reports that the hub includes a residential area, most likely for the staff working at the data centers. A ComputerWorld article says the facility will mainly serve government departments from China’s capital and across the country, but will also be open to banks and private enterprises.

ComputerWorld cites IDC data saying that despite such large-scale projects, China’s IT budget is one-fifth the size of the US’s. IDC says China’s IT spending is growing at a higher pace than the US’s: the research house says China’s IT market will total $112 billion for full-year 2011, while the US IT market will be $564 billion.

China’s IT industry isn’t that big at this point and “there is a lot of reliance on the vendors” to design data centers, Dale Sartor, an engineer at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory who visited about eight data centers in China last year, told ComputerWorld. Mr. Sartor expects to see accelerating data center development in China, particularly involving very large centers delivering cloud services. Large data centers may soon be the norm. Mr. Sartor says:

I got a sense that the cloud is going to be huge in China for both efficiency reasons as well as the ability to control … If everything was cloud computing and the government owns it, it’s much easier to keep your finger on the Internet and other issues than [by using] a very distributed model.

China’s rapid IT growth has been a plus for IBM, which said its growth in that country in 2010 was up 25% over the year before. According to ComputerWorld, IBM’s data-center business in China has tripled in the last four years. In 2010, China overtook Japan as IBM’s second-largest data center market, with the U.S. as the company’s number one market.

In terms of size, the data centers will be among the world’s biggest. The largest known data center complex is a 1.1-million-square-foot facility in Chicago owned by Digital Realty Trust, according to Data Center Knowledge, which has ranked the data centers by size.

rb-

Not the same Cloud City that Lando Calrissian ran.

 

Is your data center that big?
