Tag Archive for Networking

Server Sprawl

Data Center Knowledge’s Server Sprawl article reports on an interesting survey from Netcraft. Netcraft has developed a technique for identifying the number of computers (rather than IP addresses) acting as web servers on the internet, and can then attribute those servers to hosting locations through reverse DNS lookups. This provides an independent view, with a consistent methodology worldwide, of the number of web servers, their rate of growth over time, and the operating systems and web server technology used at each hosting company.
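
Netcraft’s actual methodology is proprietary, but the attribution step described above can be sketched in a few lines of Python: a reverse DNS (PTR) lookup maps a server’s IP address to a hostname, whose registrable domain usually names the hosting company. The function names and the two-label domain heuristic here are mine, not Netcraft’s.

```python
import socket

def registrable_domain(hostname):
    """Collapse a reverse-DNS hostname to its registrable domain,
    which usually identifies the hosting company.
    (Crude two-label heuristic; a real survey would use the
    public-suffix list.)"""
    parts = hostname.rstrip(".").split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else hostname

def hosting_company(ip):
    """Attribute a web server IP to a hoster via a reverse DNS (PTR)
    lookup -- a rough sketch of the attribution step, not Netcraft's
    actual pipeline."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        return registrable_domain(hostname)
    except (socket.herror, socket.gaierror):
        return None  # no PTR record published for this address
```

For example, an EC2 reverse-DNS name like `ec2-203-0-113-4.compute-1.amazonaws.com` collapses to `amazonaws.com`, attributing the server to Amazon.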

Through an analysis of public reports and the Netcraft server count, Data Center Knowledge developed a list of organizations with a large number of servers.

Number of servers

The Data Center Knowledge article goes on to speculate on the degree of server sprawl at some of the more secretive firms:

There’s a widely circulated estimate of 450,000 servers, but that number is at least three years old. If it was ever correct, it certainly isn’t anymore, given Google’s data center building spree. Google’s recently revealed container data center holds more than 45,000 servers, and that’s a single facility built during 2005.

There are actually some numbers on Microsoft’s server count, but it’s also dated. Screenshots from the company’s data center management software suggest that Microsoft was running about 218,000 servers in mid-2008. The company’s new Chicago container farm will hold up to 300,000 servers, so the count will change rapidly when that facility is deployed.

Amazon says very little about its data center operations, but we know that it bought $86 million in servers from Rackable in 2008, and stores 40 billion objects in its S3 storage service.

With more than 160 million active users between its online auction house and PayPal payment service and 443 million users on Skype, eBay has a massive data center infrastructure. The company houses more than 8.5 petabytes of data in huge data warehouses. We’re not certain what kind of server count this requires, but it’s certainly in the 50,000 club.

The third major search portal, Yahoo, likely has more than 50,000 servers in operation to support its large free hosting operation as well as its paid hosting service and Yahoo Stores.

It’s the world’s largest domain registrar with more than 35 million domains under management, but effective cross-selling of its hosting plans has also made GoDaddy one of the largest shared hosting operations in the world. Its infrastructure is probably similar in scope to that of 1&1 Internet.

While server “ownership” is less distinct with system integrators, EDS has an enormous data center operation. Company documents say EDS is managing 380,000 servers in 180 data centers.

With more than 8 million square feet of data center space, IBM also houses an enormous number of servers in its data centers, both for itself and its customers.

Facebook says only that it has more than 10,000 servers, but it’s been saying that since April 2008 and it’s now serving 200 million users and hosting at least 40 billion photos. Facebook is clearly way beyond 10,000 servers.

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

HDTV over Wi-Fi

TelephonyOnline has an article speculating that wireless high-definition television will be available this summer. Celeno Communications, an Israeli start-up backed by Cisco, manufactures Wi-Fi chips. Its semiconductors can make Wi-Fi networks robust enough to deliver multiple high-definition television (HDTV) streams to PCs, TVs, or other consumer electronics devices. Celeno’s technology would deliver on a significant part of the anywhere, anytime video promise.

Celeno’s OptimizAIR technology works with existing receivers such as set-top boxes and uses the 5 GHz spectrum. OptimizAIR runs on standard PHY and MAC layers but adds proprietary algorithms that the company says can double the throughput of standard 802.11 Wi-Fi and increase the range of the Wi-Fi signal as much as eight times. Celeno’s additions include Spatial Channel Awareness and beam-forming MIMO (multiple input, multiple output). The company says it can stream HD video 120 feet, through four brick walls and across more than three floors.

 


IBM Resurrects Broadband over Powerline

A NetworkWorld article proves that where there is money to be taken from the federal government, Never Say Never Again. According to the article, IBM (IBM) has started building out broadband over powerline (BPL) networks. The company says BPL could offer broadband connectivity to 200,000 people living in rural areas.

IBM is building out the BPL networks as part of a $9.6 million deal with International Broadband Electric Communications (IBEC). In 2008, IBM inked a deal with the Alabama-based broadband provider to expand broadband access to people living in rural areas. The companies plan to deploy BPL networks to serve areas that only have access to dial-up services. The BPL will be delivered through seven electric cooperatives in Virginia, Michigan, Alabama, and Indiana. Once the networks are working, IBEC will serve as the cooperatives’ official ISP.

Broadband over Powerline in Michigan

Bob Hance, CEO of Michigan-based Midwest Energy Cooperative, says his company decided to take part in the BPL network program after a customer survey. The survey results, Mr. Hance says, were overwhelmingly in favor of signing up for the broadband program. Within a week, the cooperative had a waiting list of 4,000 customers practically pleading for service. “We were amazed by the responses to the survey — thousands of letters from citizens of our community expressing their need for broadband in order to improve everything from childhood education to the future of their family-owned small businesses,” said Mr. Hance.

“We shared nearly 600 of these letters with local legislators after we realized none of the major service providers were going to answer their calls for help. Thanks to the help of those legislators, IBM and IBEC were able to access the resources needed to help our community. In less than two weeks, we’ve already deployed 400 live miles with broadband access, or nearly 4,000 homes,” according to a February 19, 2009, press release from IBM and IBEC.

Electric companies’ benefits

IBM says that in addition to bringing broadband connectivity to under-served areas, the new BPL connectivity will benefit electric companies. The BPL rollout will increase electric companies’ ability to monitor, manage, and control the reliability of their electrical grids. Currently, electric cooperatives serve roughly 12% of the population in the United States and provide about 45% of the electrical grid. The give-away American Recovery and Reinvestment Act of 2009 includes $11 billion to be spent on “smart grid” systems to monitor and manage the nation’s electrical network.

Government handout

rb-

Of course, I may be overly cynical if I question the timing of the IBM announcement. It came just 24 hours after the $787 billion give-away American Recovery and Reinvestment Act of 2009 was signed by President Obama. In case you didn’t find them, the five pages entitled Division B, Title VI – Broadband Technology Opportunities Program (pages 398-402 of 407) authorize $7.2 billion to give away stimulate the expansion of broadband networks into rural and underdeveloped areas of the country.

BPL so far has not caught on as a broadband technology in the United States; as of May 2008, there were only 4,776 broadband over powerline subscribers nationwide.

 


Terabit Ethernet

Over at The Register, there is an article heralding the coming of Terabit Ethernet. Apparently, researchers from Australia, China, and Denmark think they have opened the door to terabit-per-second Ethernet links by multiplexing 10 Gbit/s data streams and using small chalcogenide chips to demultiplex them.

In the paper, entertainingly entitled Breakthrough switching speed with an all-optical chalcogenide glass chip: 640 Gbit/s demultiplexing, the researchers describe how injecting multiple 10 Gbit/s data streams into optical cables is not a problem using existing optical technology (an electro-optic modulator per stream) and optical time-division multiplexing (OTDM).

Recombining the data streams

The obstacle has been recombining those separate data streams at the end of the link, and doing it fast enough. Despite the recent hype about 40 Gbit/s Ethernet, receiving and recombining these streams is a problem at output rates higher than 40 Gbit/s, according to the research paper, published in Optics Express, Vol. 17, Issue 4, on February 16.

Until now, the recombination has been carried out using photodetectors that can operate at up to 40 Gbit/s or so, which limits a link to just four 10 Gbit/s streams. Achieving higher data rates this way means sending more parallel data streams down the cable and demultiplexing them (switching or recombining them into one data stream) faster still. This latest research sidesteps the limit with waveguides just 5 cm long made from chalcogenide glass, with switching speeds measured in femtoseconds (quadrillionths of a second).

The researchers conclude that their test results confirm the enormous potential of chalcogenide-based waveguides for ultrafast optical signal processing.

They believe their technology can be extended to demultiplex 100 10 Gbit/s data streams and so achieve terabit Ethernet capability. The article points out that commercialization of such technology, if it takes place at all, is many years away.
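
Why the demultiplexer needs femtosecond-class switching is easy to see with some back-of-the-envelope arithmetic (my numbers, not the paper’s): at an aggregate line rate, each bit occupies a time slot equal to the reciprocal of that rate, and the optical switch must open and close well within one slot to pick out a single 10 Gbit/s tributary.

```python
def bit_slot_ps(rate_gbps):
    """Duration of one bit slot, in picoseconds, at a given
    aggregate line rate in Gbit/s."""
    return 1e12 / (rate_gbps * 1e9)

# 40 Gbit/s: a 25 ps slot, within reach of today's photodetectors.
print(bit_slot_ps(40))
# 640 Gbit/s (the paper's demo): a ~1.56 ps slot, so the switch
# window must be far shorter -- hence femtosecond chalcogenide chips.
print(bit_slot_ps(640))
# 1 Tbit/s (100 x 10 Gbit/s tributaries): a 1 ps slot.
print(bit_slot_ps(1000))
```
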

rb-

Seems like it’s time to add more synonyms for huge to our vocabulary: petabyte, exabyte, zettabyte.

Some thoughts from Bob Metcalfe on TB Ethernet

 


Multi-Gigabit Wireless by 2012

A January 26, 2009, ScienceDaily article describes a new CMOS chip capable of transmitting 60 GHz digital RF signals. The new chip enables rapid wireless transfer of a high-definition movie from a PC to a cell phone. It was developed at the Georgia Institute of Technology‘s Georgia Electronic Design Center (GEDC).

There are many potential 60 GHz applications, including virtually wireless desktop computers, data centers, and wireless home DVD systems. The technology would allow in-store kiosks to transfer movies to handheld devices in seconds, and could move gigabytes of photos or video from a camera to a PC almost instantly.

Experts believe that this technology could yield high-speed, short-range wireless applications by 2012. According to Joy Laskar, director of the GEDC, “Consumers could see products capable of ultra-fast short-range data transfer within two or three years.” Ann Revell-Pechar, chair of the MIT Enterprise Forum of Atlanta Chapter, says, “Multi-gigabit wireless technology is widely perceived to bring important new wireless applications to both consumer and IT markets.” Darko Kirovski, senior researcher at Microsoft Research, says, “Multi-gigabit technology definitely has major promise for new consumer and IT applications.”

Unprecedented short-range wireless speeds

Researchers have already achieved very high data transfer rates that promise unprecedented short-range wireless speeds: 15 Gbps at a distance of 1 meter, 10 Gbps at 2 meters, and 5 Gbps at 5 meters.
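
To put those rates in perspective, a rough calculation (my illustrative file size, and assuming the full link rate is usable, which real protocol overhead would reduce) shows why “a movie in seconds” is plausible:

```python
# Reported 60 GHz rates: distance in meters -> Gbit/s
RATES_GBPS = {1: 15, 2: 10, 5: 5}

def transfer_seconds(size_gigabytes, rate_gbps):
    """Seconds to move size_gigabytes of data at rate_gbps,
    converting bytes to bits (x8)."""
    return size_gigabytes * 8 / rate_gbps

# A hypothetical 4 GB HD movie, kiosk to handheld at 1 meter:
print(transfer_seconds(4, RATES_GBPS[1]))  # ~2.1 seconds
```

Even at the 5-meter rate, the same movie would move in under seven seconds, versus hours over a typical 2009 home broadband link.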

The GEDC-developed chip is the first 60 GHz embedded chip for multimedia multi-gigabit wireless use. According to Prof. Laskar, this new technology “represents the highest level of integration for 60 GHz wireless single-chip solutions. It offers the lowest energy per bit transmitted wirelessly at multi-gigabit data rates reported to date.”

Industry group Ecma International recently announced a worldwide standard for radio frequency (RF) technology that makes 60 GHz “multi-gigabit” data transfer possible. The specifications for this technology are expected to be published as an ISO standard in 2009.

 
