Sears Converts Stores to Data Centers

-Updated 07-12-16- Data Center Frontier reports that Sears ultimately decided to spin off its Sears and Kmart stores as a real estate investment trust (REIT) rather than converting them into data centers.

The blinking blue lights of servers may soon fill the aisles that previously offered the Blue Light Special, according to an article in Data Center Knowledge by Rich Miller. Sears Holdings (SHLD) has formed a new unit to market space from former Sears and Kmart retail stores as a home for data centers, disaster recovery space, and wireless towers.

With the creation of Ubiquity Critical Environments, Sears hopes to convert the retail icons of the 20th century into the Internet infrastructure that powers the 21st-century digital economy. The article says Sears Holdings has one of the largest real estate portfolios in the country, with 3,200 properties spanning 25 million square feet of space, including dozens of closed Sears and Kmart stores. Sean Farney, the COO of the newly formed Ubiquity, told DCK he believes the firm has a great asset on its hands.

It’s an amazing real estate portfolio … The goal is not to sell off properties. It’s to reposition the assets of this iconic brand. The big idea is that you have a technology platform laid atop a retail footprint, creating the possibility for a product with a very different look to it.

COO Farney is an industry veteran who previously managed Microsoft’s huge Chicago data center and then ran a network of low-latency services for the financial services firm Interactive Data. He told DCK he sees an opportunity to build three lines of business atop the Sears portfolio: data centers, disaster recovery sites, and “communications colocation,” in which Ubiquity leases rooftop space to wireless providers.

Ubiquity will be able to leverage real estate at both closed stores and some that are still operating, depending on the opportunity. The first step has been to evaluate the portfolio and identify properties that could work as data centers. The article reports that Chicago engineering firm ESD has conducted “data center fitness tests” on promising properties to size up their power, fiber, and risk profiles. Ubiquity is also working with Newmark Grubb Knight Frank to market the portfolio to the brokerage community.

The first Ubiquity project will be a Sears store on the south side of Chicago, nestled alongside the Chicago Skyway. The 127,000-square-foot store will be retrofitted as a multi-tenant data center. Ubiquity’s Farney says he already has a commitment for the first tenant at the site on East 79th Street, which has 5 megawatts of existing power capacity and the potential to expand. “It’s a building that’s lit very well, from both a fiber and power perspective,” Mr. Farney told the author. “It’s going to be a great data center building.”

Mr. Farney acknowledges that many of Sears’ mall-based retail locations aren’t viable for data center usage. “I don’t think the industry is yet ready for a mall-based data center,” he said. “That may take some time. The stand-alone location is optimal.”

Ubiquity has those stand-alone facilities, along with distribution centers and some parcels of vacant land. “There are closed Kmarts that are stand-alone, 200,000-square-foot properties with good fiber and power and 10 acres of parking,” said Mr. Farney. “These are owned assets.”

The article cites the COO, who says Ubiquity has flexibility in how it works with tenants. It could finance a buildout and then hand over a wholesale data center to an enterprise or managed hosting provider, or it could opt for a powered-shell solution for a tenant, depending on the customer’s needs.

After initially focusing solely on data centers, Ubiquity has expanded its strategy, Mr. Miller explains. Although mall-based stores may not be right for data centers, they could be ideal for disaster recovery facilities, Mr. Farney said. That includes mall stores that have closed, as well as those that have downsized to a smaller retail footprint. In either scenario, a separate workspace could be created with an exterior entrance to restrict access, while still allowing employees to take advantage of nearby stores and eateries. Mr. Farney believes this makes sense for the client.

There are compelling reasons why this is a great model … It used to be that business continuity centers were located in an industrial park. The customer has evolved to the point where they want a sexier location, where they can have access to a Starbucks and other retail, because it’s possible they may be there for weeks or months. Sears and Kmart stores are located in just such retail locations in major malls.

The COO also predicts that customers are ready for a more distributed approach to business continuity.

In the past, customers had a single monolithic recovery center … Now, after (Hurricane) Sandy, there’s a need for multiple locations, because you don’t want to be tied to one location in a regional disaster. There’s a desire to have multiple locations to spread costs across multiple areas. The Sears footprint really fits that.

Then there’s wireless, which the article says is the most exciting opportunity. Mr. Farney says that seventy percent of the U.S. population lives within 10 miles of a Sears or Kmart store.

When malls were being built, they gravitated to the intersection of freeways and highways, and Sears got entry to all of them … These rooftops have proximity to the greatest mass of consumers available. As wireless users grow, the size of the cell is shrinking, creating holes in coverage. Having rooftop access to the cars and pedestrians around the malls is important. The Sears portfolio can capture that … There’s tons of interest. I will put as many of the rooftops in play as I can.

rb-

This is rather innovative, out-of-the-big-box thinking and a smart use of space for a company with a huge real estate portfolio.

Sears’ solution to the problem of now-vacant retail buildings isn’t to sell them off for scrap and hope for the best but to hang on to its assets and find a way to make them more profitable. Every struggling company and town in this country could learn a lesson from Sears.

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

The Evolution of Backup

Have you ever stopped to think about how the technology for data protection has evolved? Backup has been around, in one form or another, since 3000 B.C. It has evolved and adapted to take advantage of improvements in technology platforms. Storage vendor Axcient traces the evolution of backup technology from clay tablets to the cloud in this infographic.

Axcient traces key events in the evolution of backup methods.

Axcient infographic: the evolution of backup

According to CrunchBase, Axcient is an entirely new type of cloud platform whose technology stack is built to eliminate data loss, keep applications up and running, and make sure that IT infrastructures never go down.

Axcient is designed for today’s always-on business. The system replaces legacy backup, business continuity, and disaster recovery software and hardware. They claim it reduces the amount of expensive copy data in an organization by as much as 80%.

By mirroring an entire business in the cloud, Axcient makes it simple to access and restore data from any device. They claim that with a single click their app can configure failover systems, and virtualize your entire office – all from a single deduplicated copy.

rb-

The key to any successful Business Continuity Plan is a solid, verified backup plan. The impact of a major data loss on an SMB can be devastating. The actual numbers are debatable; however, it seems that a significant number of firms go out of business after a major data loss.

There are many new ways to back up your data, from Acronis, Axcient, Barracuda (CUDA), EMC (EMC), Exagrid, HP (HPQ), IBM (IBM), Symantec (SYMC), and Veeam. What is important is that you have a plan, execute it, and test it.



70s Glitch Could Hit Every Computer On Earth

Rebecca Borison at Business Insider asks who remembers the 1999 panic over the Y2K crisis. Y2K looked as if it might derail modern life because computers used only two digits to represent the year in their internal clocks, so the glitch would reset dates to Jan. 1, 1900, rather than Jan. 1, 2000.

Now it’s déjà vu all over again: BI reports there’s a new, even bigger global software coding fiasco looming. A huge amount of computer software could fail around the year 2038 because of issues with the way the code that runs it measures time.

Once again, just as with Y2K, every piece of software and computer code on the planet must be checked and updated. That is not a trivial task, according to the author. In 2000, we bypassed the Y2K problem by recoding all the software, Ms. Borison explains: a fantastically laborious retrospective global software patch.

Disruption to the tech industry

Although Y2K was not a disaster, it was a massive disruption to the tech industry at the time. Virtually every company on the planet running any type of software had to find its specific Y2K issues and hire someone to fix them. Ultimately, Y2K caused ordinary people very few problems, but only because of a huge expenditure of time and resources within the tech business.

The 2038 problem will affect software that uses what’s called a signed 32-bit integer for storing time. The problem arises because such software can only count up to 2,147,483,647 seconds, the biggest positive value a signed 32-bit integer can represent.

When the engineers who developed the first UNIX operating system in the 1970s needed a clock, they arbitrarily decided that time would be represented as a signed 32-bit integer (or number), measured as the number of seconds since 12:00:00 a.m. on January 1, 1970.

Glitch says it’s 1970 again

On January 19, 2038 (2,147,483,647 seconds after January 1, 1970), these computer programs will exceed the maximum value of time expressible by a signed 32-bit integer, and any software that hasn’t been fixed will then wrap around, behaving as if it were the distant past again (strictly, the counter overflows to its most negative value, which corresponds to December 13, 1901, rather than to zero).
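The rollover is easy to demonstrate with ordinary integer arithmetic. The sketch below (Python, standard library only) computes the last representable moment and shows where a signed 32-bit counter lands one second later. Note that in two's-complement arithmetic the counter overflows to its most negative value, which maps to December 1901, not to zero.

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
MAX_INT32 = 2**31 - 1  # 2,147,483,647, the largest signed 32-bit value

def wrap_int32(n):
    """Reduce n into signed 32-bit range, as two's-complement overflow would."""
    return (n + 2**31) % 2**32 - 2**31

# The last moment a signed 32-bit seconds counter can represent
print(EPOCH + timedelta(seconds=MAX_INT32))   # 2038-01-19 03:14:07+00:00

# One second later the counter overflows to its most negative value...
wrapped = wrap_int32(MAX_INT32 + 1)
print(wrapped)                                # -2147483648

# ...so software reading it as a timestamp lands in 1901
print(EPOCH + timedelta(seconds=wrapped))     # 1901-12-13 20:45:52+00:00
```

The `wrap_int32` helper is just an illustration of what overflowing C code does implicitly; real 32-bit programs wrap silently in hardware.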

UNIX time coding has since been incorporated widely into software and hardware systems that need to measure time.

BI spoke with Jonathan Smith, a Computer and Information Science professor at the University of Pennsylvania, for confirmation. The professor confirmed that Year 2038 is a real problem that will affect a specific subset of software that counts on a clock progressing positively. He elaborated:

Most UNIX-based systems use a 32-bit clock that starts at the arbitrary date of 1/1/1970, so adding 68 years gives you a risk of overflow at 2038 … Timers could stop working, scheduled reminders might not occur (e.g., calendar appointments), scheduled updates or backups might not occur, billing intervals might not be calculated correctly.

The article concludes that we all just need to switch to higher bit values, like 64 bits, which give a much higher maximum. In the last few years, more personal computers have made this shift, as have companies that already need to project time past 2038, like banks that deal with 30-year mortgages.
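A quick back-of-the-envelope calculation shows how much headroom the switch buys (a sketch in Python; the constants are just the maximum signed values for each width):

```python
SECONDS_PER_YEAR = 365.25 * 86400  # average year, in seconds

max32 = 2**31 - 1   # signed 32-bit seconds counter
max64 = 2**63 - 1   # signed 64-bit seconds counter

print(max32 / SECONDS_PER_YEAR)  # ~68 years from 1970 -> overflow in 2038
print(max64 / SECONDS_PER_YEAR)  # ~292 billion years -- effectively forever
```

In other words, a 64-bit seconds counter outlasts the expected lifetime of the sun by a wide margin, which is why simply widening the integer settles the question.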

Apple (AAPL) claims that the iPhone 5S is the first 64-bit smartphone. But the 2038 problem applies to both hardware and software, so even if the 5S uses 64 bits, an alarm clock app on the phone needs to be updated as well. (If it’s still using a 32-bit clock in 2038, it will wake you up in 1970, so to speak.) So the issue is more of a logistical problem than a technical one.

HowStuffWorks reports that some platforms have different doomsdays.

  • IBM (IBM) PC hardware suffers from the Year 2116 problem. For a PC, the beginning of time is January 1, 1980, and seconds are counted in an unsigned 32-bit integer, much like UNIX time. By 2116, the integer overflows.
  • Microsoft (MSFT) Windows NT uses a 64-bit integer to track time. However, it uses 100 nanoseconds as its increment, and its beginning of time is January 1, 1601, so NT suffers from the Year 2184 problem.
  • Apple states that the Mac is okay out to the year 29,940!

rb-

The tech industry’s response to Y2K suggests that it will mostly ignore the 2038 issue until the very last minute, when it becomes impossible to ignore. Another example of the pace of global software updates: a majority of ATM cash machines were still running Windows XP, and thus vulnerable to hackers, even though Microsoft discontinued the product in 2007.

Fortunately, the 2038 problem is somewhat easier to fix than the Y2K problem. Well-written programs can simply be recompiled with a new version of the C library that uses 8-byte values for the storage format. This is possible because the C library encapsulates the whole time activity with its own time types and functions (unlike most mainframe programs, which did not standardize their date formats or calculations). So the Year 2038 problem should not be nearly as hard to fix as Y2K was.
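A minimal illustration of why 8-byte values settle the matter: a timestamp just past the 2038 boundary simply does not fit in four bytes. This Python sketch uses the standard `struct` module to pack an integer into fixed-width binary storage, the way a C `time_t` would be stored on disk or on the wire.

```python
import calendar
import struct

# A timestamp safely past the 2038 boundary: 1 Jan 2040, 00:00 UTC
ts = calendar.timegm((2040, 1, 1, 0, 0, 0))
print(ts)  # 2208988800 -- larger than 2**31 - 1

# Packing it as a signed 4-byte value fails...
try:
    struct.pack("<i", ts)
except struct.error as exc:
    print("4-byte storage overflows:", exc)

# ...while a signed 8-byte value holds it with room to spare
print(struct.pack("<q", ts).hex())
```

The same logic is why recompiling against a C library with an 8-byte `time_t` fixes well-behaved programs: the width of the storage, not the program logic, is what overflows.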



Memorial Day 2014

Thanks. Memorial Day 2014.



BYOD: My Phone Your Problem

Fujitsu warns that BYOD programs have a lot of hidden costs that IT departments often do not consider, according to a recent article on FierceMobileIT. Craig Merrick, the managing consultant for mobile business solutions at Fujitsu (6702), explains the sources of the extra costs of a BYOD program.

The enterprise can incur significant additional costs if it tries to support all the versions of operating systems used by BYOD employees. Mr. Merrick says software updates to smartphones could cause problems with existing corporate applications, which could lead to the help desk being overwhelmed with calls.

BYOD support costs

He cites a recent Fujitsu survey of 25,000 BYOD end users, which found that 80% of users believe their corporate IT department is responsible for fixing issues with their personal devices. “They want to bring their own device but they don’t want to take responsibility for fixing it,” Fujitsu’s Merrick said. Gartner (IT) forecasts that supporting BYOD will cost enterprises $300 per employee annually by 2016, up from a current $100 per employee annually.

Another area of unforeseen cost, according to the article, is a security breach caused by BYOD. A survey (PDF) of 790 IT professionals conducted by Dimensional Research on behalf of security firm Check Point found that 79% of respondents reported a mobile security incident within the past year. Many of these incidents stemmed from employees storing corporate information on personal devices.

Mobile security incidents

The report revealed that more than half of large businesses reported mobile security incidents that cost them more than $500,000. For 45% of SMBs, mobile security incidents exceeded $100,000 in the past year, the survey found. Tomer Teller, security evangelist and researcher at Check Point, commented:

Without question, the explosion of BYOD, mobile apps, and cloud services has created a herculean task to protect corporate information for businesses both large and small.

The article concludes that additional costs for firms contemplating BYOD can include network infrastructure upgrades, wireless service costs, device management product investments, and application and software investments, explained Forrester (FORR) analyst Michele Pelino.

rb-

Many businesses believe that implementing a BYOD policy will save them both the capital outlay of acquiring devices and the ongoing cost of maintaining them. But the reality does not always match the theory. Planning and implementing a successful BYOD program requires executives to understand the costs as well as the benefits.

