Archive for RB

Michigan Broadband Below Average

Ookla, the Seattle-based firm that runs the www.speedtest.net website, has released a mountain of data at Netindex.com that identifies upload and download speeds for a myriad of locations across the globe. According to the website, the index compares and ranks consumer download test results from Speedtest.net. The value is the rolling average throughput in Mbps over the past 30 days, where the mean distance between the client and the server is less than 300 miles. The results are not good for the US or for Michigan.

As of 08-01-10, the global household download index is 7.61 Mbps and the US household download index is 8.88 Mbps. The United States ranks 27th in the world, trailing a number of other countries.

The results for Michigan are equally disappointing. The Michigan download index is 8.02 Mbps, below the US download index number, and Michigan ranks 35th for download speed and 31st for upload speed. The US national upload index is 2.14 Mbps; Michigan's is 1.62 Mbps.
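Netindex's methodology, a 30-day rolling mean of nearby test results, can be sketched in a few lines of Python. The sample data here is made up for illustration, not taken from Ookla's dataset:

```python
from datetime import date, timedelta

def rolling_index(results, as_of, window_days=30):
    """Mean throughput (Mbps) of tests taken in the window_days before as_of."""
    cutoff = as_of - timedelta(days=window_days)
    recent = [mbps for day, mbps in results if cutoff <= day <= as_of]
    return sum(recent) / len(recent) if recent else None

# Hypothetical speed-test samples: (test date, measured Mbps)
samples = [
    (date(2010, 7, 5), 7.9),
    (date(2010, 7, 15), 8.4),
    (date(2010, 7, 28), 7.8),
    (date(2010, 5, 1), 3.0),   # too old -- falls outside the 30-day window
]
print(rolling_index(samples, date(2010, 8, 1)))  # mean of the three July tests
```

In the real index, Ookla also filters tests by client-to-server distance (under 300 miles), which is omitted here for brevity.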

Most Michigan cities' download speeds pale in international comparison. Niles, Michigan, ranked first in the state with a download speed of 18.41 Mbps, but that is nowhere near the best speed available in Seoul, South Korea. The following table lists the top-performing Michigan cities and compares them to the international competition as well as to major Michigan cities.

Location: Download (Mbps)
Seoul, South Korea: 31.59
Bucharest, Romania: 22.72
Vilnius, Lithuania: 19.29
Niles, Michigan: 18.41
Cebu, Philippines: 17.73
Sault Sainte Marie, Michigan: 17.23
Amsterdam, Netherlands: 16.16
Oxford, Michigan: 15.75
Omsk, Russia: 15.17
Big Rapids, Michigan: 15.12
San José, California: 14.41
Marquette, Michigan: 10.62
Ann Arbor, Michigan: 10.52
Lansing, Michigan: 10.50
Grand Rapids, Michigan: 9.30

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Apple Has Most Holes

Security company Secunia reports that Apple (AAPL) software has the most security vulnerabilities. According to the recent Secunia Half Year Report 2010 (PDF), Apple has displaced Oracle as the company with the most security vulnerabilities in its software over the first half of 2010. Microsoft retains its third-place spot.

Wired points out that this does not necessarily mean that Apple's software is the most insecure in practice. The report takes no account of the severity of the flaws, but it does point to a growing trend in the world of security flaws: the role of third-party software. Many of Apple's flaws are not in its operating system, Mac OS X, but rather in software like Safari, QuickTime, and iTunes. Vendors like Adobe (with Flash and Adobe Reader) and Oracle (with Java) are similarly responsible for many of the flaws being reported. The top ten third-party applications, ranked by total number of reported vulnerabilities:

1. Mozilla Firefox
2. Apple Safari
3. Sun Java JRE
4. Google Chrome
5. Adobe Reader
6. Adobe Acrobat
7. Adobe Flash Player
8. Adobe AIR
9. Apple iTunes
10. Mozilla Thunderbird

To illustrate this point, Ars Technica says the report includes cumulative figures for the number of vulnerabilities found on a Windows PC with the 50 most widely used programs. Five years ago, there were more first-party flaws (in Windows and Microsoft's other software) than third-party ones. Since about 2007, the balance has shifted towards third-party programs. Secunia predicts that third-party flaws will outnumber first-party flaws two-to-one by the end of 2010.

Secunia also makes the case that effectively updating third-party software is much harder to do: Microsoft's Windows Update and Microsoft Update systems offer protection for only around 35% of reported vulnerabilities, and patching the rest requires the use of 13 or more separate updating systems. Some vendors, such as Apple, Mozilla, and Google, do have decent automatic update systems, but others require manual intervention by the user.



 


Intel Shows TBps Connections

The EETimes reports that researchers at Intel Corp. (INTC) have demonstrated optical chips that can transmit data at up to a terabit per second. Intel predicts the new silicon photonic chips will replace copper connections in everything from supercomputers to servers to PCs. The chips can currently transmit data at 50 gigabits per second (Gbps), fast enough to transfer an HD movie every second.

“This milestone marks the beginning of silicon photonics in the high-volume marketplace, in applications from [high-performance computing] all the way down to the client PC,” said Mario Paniccia, director of Intel’s Photonics Technology Lab. “We see a clear development path from 50 Gbps today to a terabit in the future,” Mr. Paniccia told EETimes.

Intel says that optical connections could eventually replace the copper connections between systems, between boards in the same system, and even between cores on the same board. Intel's Paniccia estimated that the first commercial applications of silicon photonics will begin appearing in as little as five years in data centers and supercomputer facilities.

The modulators required to encode optical information using signal waveguides and photodiodes are cast in silicon on custom chips designed by Intel. The transmitter chip uses Intel’s hybrid silicon laser technology, which bonds a small indium phosphide die to on-chip silicon waveguides, four of which are patterned into a connected optical laser. “We combined our silicon manufacturing techniques with our hybrid laser and demonstrated an integrated transmitter using four lasers, each operating at a different wavelength, and four silicon modulators, each operating at 12.5 Gbps, then combined them together into an aggregate 50 Gbps into the optical fiber,” said Paniccia.

The optical fiber output on the receiver chip is then filtered into separate colors and diverted by waveguides into four separate photodiodes, each of which receives one of the four 12.5-Gbps channels. In the future, Intel plans to add more lasers per chip and increase the number of channels; it believes it can put 25 lasers on a single chip to reach 1 Tbps, and it then hopes to commercialize the optical connection technology. Intel has been developing the technology since 2004.
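The arithmetic behind the channel counts is simple wavelength-division multiplexing: aggregate rate equals lasers times per-channel rate. A quick sketch (the 40 Gbps per-channel figure for the 25-laser target is inferred from the numbers above, not stated by Intel):

```python
def aggregate_gbps(lasers, gbps_per_channel):
    """Total link rate when each laser carries one WDM channel."""
    return lasers * gbps_per_channel

# Today's demo: four lasers, each modulated at 12.5 Gbps
demo = aggregate_gbps(4, 12.5)            # 50 Gbps

# Implied per-channel rate to reach 1 Tbps with 25 lasers
needed = 1000 / 25                        # 40 Gbps per channel
print(demo, needed)
```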

Intel already has a 10-Gbps Light Peak chip, an optical cable technology aimed at reducing the number of port connections on a computer. The Silicon Photonics Link is different: Intel said Light Peak uses traditional optical devices, and scaling those beyond 10 Gbps speeds would be difficult.

rb-

For some perspective, the 1 terabit per second link could transfer the entire printed collection of the Library of Congress in 1.5 minutes.
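That figure is consistent with the rough, commonly cited estimate of about 10 terabytes for the Library of Congress's printed collection (estimates of the collection's size vary, which accounts for the difference from 1.5 minutes):

```python
loc_bytes = 10e12               # ~10 TB, a rough, commonly cited estimate
loc_bits = loc_bytes * 8        # 80 terabits
seconds = loc_bits / 1e12       # at 1 terabit per second
minutes = seconds / 60
print(minutes)                  # roughly 1.3 minutes
```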

Intel is preaching high bandwidth and low cost with these chips. If Intel can deliver, it could change the nature of system design. Theoretically, these chips could allow system components to be spaced farther apart without a performance hit. With these chips, a data center expansion could go down the hall instead of requiring a full redesign. It may become cheaper to take the new gear to the available electrical panel rather than adding a new panel to the server room.

Intel’s Paniccia told VentureBeat that the accuracy of the data transfer is superb. So far, the link has transferred data with no errors for 27 hours straight, which means it can transfer more than a petabyte of data without an error.

 


More Dell Hardware Woes

– Updated 10-08-10 – Dell has settled the lawsuit that claimed the computer manufacturer hid computer defects. The New York Times reports that Dell settled the suit (09-23-2010) brought by Advanced Internet Technologies in Federal District Court in North Carolina. The terms of the tentative settlement were not disclosed.

In the NYT article, Clarence E. Briggs III, chief executive for Advanced Internet, in Fayetteville, NC, declined to comment about the settlement, as did his lawyer. David S. Frink, a spokesman for Dell, in Round Rock, TX, told the NYT “settling the matter is better and more cost-effective for the company than taking the case to trial.”

– Updated 08-15-10 – The New York Times is reporting that Advanced Internet Technologies (A.I.T.) is accusing Dell of withholding evidence in their lawsuit, among it e-mails between top executives such as Michael Dell, in a filing made Thursday. According to the NYT, A.I.T. filed a motion in Federal District Court in North Carolina asserting that Dell had deliberately violated a court order by failing to produce documents written by its executives, including the company’s chief executive and founder, Michael S. Dell.

In its filing, A.I.T. asserted that Dell had provided only a snippet of the communications among top executives about the faulty computer problems. The NYT says A.I.T. argued that Dell must have had more high-level communications than a “talking points” memorandum sent to Mr. Dell and Kevin Rollins, then the chief executive.

Larry E. Daniel, a digital forensics expert, has filed an affidavit in the case, stating that the handful of messages Dell provided appeared altered and incomplete according to the NYT article. Mr. Daniel suggested that Dell should provide access to the underlying e-mail files rather than cutting and pasting text.

Human error is to blame for the latest Dell hardware gaffe. PCWorld is reporting that a sequence of errors led to Dell’s delivery of motherboards with malware. On 7-21-10, Dell said that some replacement motherboards for PowerEdge servers may have contained the W32.Spybot worm in flash storage. The malware issue affected a limited number of replacement motherboards in four server models, the PowerEdge R310, R410, R510, and T410, according to an email from Forrest Norrod, vice president and general manager of server platforms at the Round Rock, Texas firm.

A sequence of human errors

The company confirmed on 7-21-10 that it is in the process of overhauling its testing procedures to resolve issues before sending hardware to customers. “There was a sequence of human errors that led to the issue. That being said, we have identified and implemented 16 additional process steps to make sure this doesn’t happen again,” said Dell spokesperson Jim Hahn.

Hahn did not provide more details to PCWorld on the steps being added to track and resolve such issues, but he said that all affected motherboards had been removed from the service supply chain. Dell is quick to point out that current anti-virus software with updated signatures would flag the malware’s presence, and that users would have to be running an unpatched version of Windows 2008 or an earlier version of the OS to be vulnerable.

PCWorld cites a Dell quality management specialist who wrote in an e-mail that the code was accidentally introduced during the manufacturing process of the server motherboards. “This flash is the one that holds your BIOS and it can be updated online. If proper security precautions are not in place, the flash chip is every bit as capable of containing a piece of malware as is the hard-disk drive,” Jim Handy, director at Objective Analysis, a semiconductor research company, told PCWorld.
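One basic precaution against exactly this kind of supply-chain tampering is to verify a firmware image against a vendor-published hash before flashing it. A minimal sketch, using a stand-in file rather than any real Dell image or process:

```python
import hashlib
import os
import tempfile

def firmware_matches(path, expected_sha256):
    """Return True if the image file's SHA-256 digest matches the published value."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256.lower()

# Demo with a stand-in "image"; in real use, path is the vendor's .bin
# and expected_sha256 comes from the vendor's download page.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"firmware image bytes")
good_hash = hashlib.sha256(b"firmware image bytes").hexdigest()
ok = firmware_matches(path, good_hash)    # matching hash
bad = firmware_matches(path, "0" * 64)    # tampered or wrong image
os.remove(path)
print(ok, bad)
```

A hash check only detects accidental or post-publication tampering; it does not help if the vendor's own build, as here, was infected before the hash was published.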

Simha Sethumadhavan, assistant professor of computer science at Columbia University, told PCWorld that this incident shows how hardware, whether flash or a processor, can be used as a way to transmit malware if hacked. “All software runs on the hardware. If the processor is hacked then it can subvert all software countermeasures. Since hardware is the root of trust, attacks on hardware are potentially more dangerous.”

Other recent Dell issues include:

  • According to the New York Times, Dell is being sued for shipping at least 11.8 million OptiPlex computers from May 2003 to July 2005 that were at risk of failing because of faulty capacitors. A study by Dell found that OptiPlex computers affected by the bad capacitors were expected to cause problems up to 97 percent of the time over a three-year period, according to the lawsuit. Making matters worse, Dell replaced faulty motherboards with other faulty motherboards. The NYT points out that Dell employees went out of their way to hide these problems. In one e-mail exchange, a Dell worker states, “We need to avoid all language indicating the boards were bad or had ‘issues’ per our discussion this morning.” In other documents, Dell salespeople were told, “Don’t bring this to customer’s attention proactively” and “Emphasize uncertainty.”
  • 2010 Dell announced it was setting aside a $100 million reserve for the first quarter of fiscal 2011, related to a potential settlement with the U.S. Securities and Exchange Commission. The SEC began investigating Dell in 2005 over accusations of misleading auditors and fabricating financial information, which allowed the company to exaggerate its performance. Dell has already restated some of its financial results reported before 2007. It is reported that founder and CEO Michael Dell faces a separate fine totaling $4 million. “Accuracy and completeness are the touchstones of public company disclosure under the federal securities laws,” said SEC enforcement director Robert Khuzami. “Michael Dell and other senior Dell executives fell short of that standard repeatedly over many years, and today they are held accountable.”
  • 2010 Dell announced that it and its chairman and CEO, Michael Dell, have proposed settlements to the staff of the US Securities and Exchange Commission (SEC) over claims of illegal accounting practices. The original case and investigation reportedly date back to 2006, when Dell employees misled auditors and manipulated results to meet performance targets.
  • 2010 A federal appeals court reinstated a class-action lawsuit accusing Dell of selling defective notebook computers. The lawsuit alleges that Dell Inspiron notebooks bought between July 2004 and January 2005 had inadequate cooling systems, power supplies, and motherboards which caused the notebooks to shut down without warning, fail to boot up or deteriorate too quickly. (Reuters)
  • 2009 The New York Times and IDC confirmed that Acer overtook Dell as the Number 2 PC maker during the third quarter of 2009.
  • 2008 A New York judge concluded that Dell engaged in repeated false and deceptive advertising of its promotional credit financing and warranties, according to the New York Times.

 


Cows Can Power Your Next Server Farm

ComputerWorld reports that HP (NYSE: HPQ) researchers presented a paper (PDF) on using manure from cows to generate power to run data centers. HP says that manure from dairy farms, cattle feedlots, and other “digested farm waste” can be used to generate electricity.

HP presented the idea at the American Society of Mechanical Engineers Conference on Energy Sustainability. The researchers believe that biogas from a farm of 10,000 dairy cows could power a 1-megawatt (MW) data center, about 1,000 servers. That is the equivalent of a small bank’s computer center.
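HP's sizing implies roughly 1 kW per server, or about ten cows' worth of biogas output per server; a quick sanity check on those numbers:

```python
cows = 10_000           # herd size in HP's scenario
data_center_mw = 1.0    # data center power draw
servers = 1_000         # servers in that data center

kw_per_server = data_center_mw * 1000 / servers  # 1.0 kW per server
cows_per_server = cows / servers                 # 10 cows per server
print(kw_per_server, cows_per_server)
```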

Organic matter is already used by farms to generate power. Farmers use a process called anaerobic digestion that produces methane-rich biogas. HP’s paper looks at how the process could be extended to run a data center, starting with the amount of manure produced by a typical dairy cow and working up from there.

Connecting a data center to cows

But there are some practical problems. The first is connecting a data center to the cows. “What’s the reality of getting 10,000 cows in one place?” said Angie McEliece, an environmental consultant for RCM International in Berkeley, CA, which makes digester systems. She told ComputerWorld that the average dairy farm in the U.S. has fewer than 1,000 cows, and farms with 5,000 cows are quite unusual. Farms that now use anaerobic digestion systems to generate electricity and heat typically get some funding from federal and state grants. In such cases, a payback of four years or less on the technology is likely. “Ten years is the payback to me without grants,” said Ms. McEliece in the ComputerWorld article.


HP insists that this is just an idea sketched out on paper by a research team. No demonstration project has yet been planned. “I’ve not yet submitted a purchase order for cows,” said Tom Christian, principal research scientist at HP’s Sustainable IT Ecosystem Lab, in an e-mail to ComputerWorld. “The idea of using animal waste to generate energy has been around for centuries, with manure being used every day in remote villages to generate heat for cooking. The new idea that we are presenting in this research is to create a symbiotic relationship between farms and the IT ecosystem.” According to Christian, the new technology can benefit the farm, the data center, and the environment.

rb-

The proposal has energy-independence, economic, and ecological benefits.

Michigan had 335,000 cows in 2007. According to the HP researchers, the manure that one dairy cow produces in a day can generate 3.0 kilowatt-hours (kWh) of electrical energy. Under this plan, Michigan dairy cows could produce enough methane to move 366,825 MWh per year off the grid. That would be enough electrical power to move all of Facebook’s estimated 30,000 servers off of the grid.
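Spelled out, the Michigan estimate is straightforward multiplication of the per-cow figure from the HP paper across the herd and the year:

```python
cows = 335_000           # Michigan cows, 2007
kwh_per_cow_day = 3.0    # HP's estimate of electrical energy per cow per day

kwh_per_day = cows * kwh_per_cow_day        # 1,005,000 kWh per day
mwh_per_year = kwh_per_day * 365 / 1000     # 366,825 MWh per year
print(mwh_per_year)
```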

Economic benefits

There are economic benefits as well. Data center operators would have access to a reliable source of clean energy, presumably at a competitive if not lower cost than what’s on the market. Dairy farmers would make money selling electricity to data center customers: HP estimates that dairy farmers would break even within the first two years and could earn roughly $2 million annually from selling the power. Michael Kanellos at Greentech Media told the New York Times that there was some convenient overlap between data centers and biogas generation: “Computing equipment produces a lot of heat as a waste product, and the systems needed to create biogas require heat. So, there is a virtuous cycle of sorts possible.”

Another trend that makes this idea workable is the move to build facilities in rural locations. Where high-speed networks are available, data centers can benefit from the cost advantages of rural areas. Many agricultural areas are also ideal for wind farms, providing a second clean-energy source that could help spur some economic revival in the U.S.

Alternate energy sources such as these can also help companies prepare for a new round of regulation and taxes, for example the U.S. Waxman-Markey bill. Carbon taxes or cap-and-trade systems, both in the U.S. and abroad, will force companies to measure and report greenhouse gas emissions. Farmers will benefit from the proposed system by accumulating carbon offsets for capturing and reusing methane.

There are also environmental benefits. A system that extracts biogas from manure would cut the hefty environmental impact of animal waste; the HP paper says methane is 21 times more damaging to the environment than carbon dioxide. Additionally, farmers could be eligible to receive credits for capturing and reusing methane under any future cap-and-trade emissions legislation.

 
