Archive for Data protection

Today is World Backup Day

Take responsibility for backing up your data

March 31st is the annual World Backup Day, and now is a good time to check your backup systems. You have a plan, right? Now that so many of us are working from home, we can’t rely on Jim the backup admin to automagically make the files that mysteriously disappeared all by themselves re-appear.

You should take responsibility for backing up your data. From the data that keeps your business moving to the personal information you share and store online, your devices hold the files, images, and conversations that matter most.

According to WorldBackUpDay.com (not HTTPS), World Backup Day was founded by a few “concerned users” on Reddit. The day’s dedication is a decidedly serious one. March 31 was established as:

… a day for people to learn about the increasing role of data in our lives and the importance of regular backups.

What’s a backup?


A backup is a second copy, often a full image, of all the data stored on a device like your desktop, laptop, or tablet. With this second copy, everything on your device that matters to you stays safe and accessible in the event of accidental deletion, system failure, or a ransomware attack.
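At its simplest, that second copy can be a dated archive written somewhere other than the original device. Here is a minimal sketch in Python, using only the standard library; the source and destination paths are hypothetical examples:

```python
# A minimal sketch: snapshot a folder into a timestamped zip archive.
# The source and destination paths below are hypothetical examples.
import shutil
from datetime import datetime
from pathlib import Path

def make_backup(source_dir: Path, backup_root: Path) -> Path:
    """Archive everything under source_dir into a dated .zip under backup_root."""
    backup_root.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    base = backup_root / f"backup-{stamp}"
    # make_archive appends the .zip suffix and returns the final path.
    return Path(shutil.make_archive(str(base), "zip", root_dir=str(source_dir)))

if __name__ == "__main__":
    archive = make_backup(Path("~/Documents").expanduser(),
                          Path("~/Backups").expanduser())
    print(f"backup written to {archive}")
```

The script is the easy part; the discipline is running it on a schedule and keeping a copy off the device, ideally off-site.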

Why back up? Business continuity begins with backups; restoring data from those backups keeps the business up and running. At work, your devices store irreplaceable information. Unfortunately, it is all too easy, and costly, to lose data: over $600 billion is lost to cybercrime each year, according to a 2018 McAfee report (PDF).

Stay safe out there!


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Veeam Backup Bought

In a move to improve its U.S. market share, Veeam Software has agreed to be bought by private equity firm Insight Partners. The deal, valued at $5 billion, is Insight’s second major acquisition of 2020. Veeam is a cloud-focused data protection, backup, and disaster recovery software company.


Veeam was founded in 2006 and owned by Russians Andrei Baronov and Ratmir Timashev. The firm has grown to 365,000 customers worldwide and annual sales of more than $1 billion by capitalizing on the VMware-led server virtualization boom. As part of the takeover, the founders will leave the firm and Veeam will become a U.S. company based in New York; it had been based in Baar, Switzerland.

Veeam’s products include backup solutions, cloud security offerings, and cloud data management. Veeam’s cloud data management portfolio consists of Veeam Backup for Amazon Web Services (AWS), Veeam Backup for Microsoft Office 365, Veeam Universal License (VUL), and Veeam Backup for Microsoft Azure.

Private equity plans

The private equity company has a three-stage program to help the companies in which it invests grow, Mike Triplett, a managing director of Insight Partners and new Veeam board member, told CRN: a Startup stage focused on companies looking for early growth in their markets, a ScaleUp stage for companies with strong businesses, and a Corporate stage for companies ready for IPOs or other exits.

ZDNet says Veeam is in the second, “ScaleUp,” stage. With customers now also using hybrid cloud setups across AWS, Azure, IBM, and Google, the firm’s “Act II” is to capitalize on a growing need for cloud data management across those environments. Mr. Triplett claims Insight Partners can bring the right resources to bear to move Veeam from the “ScaleUp” stage to the “Corporate” stage.

Other Insight Partners investments

Insight Partners also owns other data protection companies, including Unitrends and Spanning. In addition to data protection, the VC has invested heavily in cybersecurity and MSP-friendly technology markets. Other key Insight Partners investments include:

rb-

Expect to see lots of PE activity this year (decade?). ChannelE2E reports that private equity investors are sitting on a record $1.5 trillion in cash. With a war chest like that, it is no wonder private equity firms and hedge funds have a bad reputation. VC firms have a history of acquiring businesses, loading them up with debt, and cutting staff to boost profits; the most recent examples are Sears and Toys R Us. ChannelE2E points out that U.S. presidential candidate Elizabeth Warren is calling for new private equity restraints to combat “legalized looting.”

I have seen that Veeam has a Russian problem. Back in the day, when I was in shared technical services, I tried to replace an HP LTO2 tape library (PDF) with a Veeam solution, and the powers-that-were did not want Veeam; we spent a lot more money to maintain the old HP LTO2 technology instead.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Black Hole Data

The first image of a black hole was published on April 10, 2019. The black hole, M87*, at the center of the Messier 87 galaxy, is located 53 million light-years from Earth. NASA says a black hole is an extremely dense object from which no light can escape. Anything that comes within a black hole’s “event horizon” will be consumed because of the black hole’s unimaginably strong gravity.

The first image of a black hole.

By its very nature, a black hole cannot be seen; the bright ring in the picture traces the event horizon, the point where an object approaching the black hole can no longer escape its gravitational pull. Objects that pass into the event horizon undergo spaghettification, a process, first described by Stephen Hawking, in which gravitational forces stretch the object out like a piece of pasta. The M87* image shows the silhouette of the black hole against the glow of the event horizon, captured by researchers at the Event Horizon Telescope (EHT).

The EHT is the brainchild of Shep Doeleman, the director of EHT and an astronomer at the Harvard-Smithsonian Center for Astrophysics. It is a virtual global array of eight ground-based radio telescopes. The EHT captured around 3.5 PB of data for the black hole image in April 2017; it then took two years to correlate that data into the image. The EHT team not only had to figure out intergalactic science but also massive information technology problems. The researchers had to solve IT problems pretty typical for enterprise IT professionals, only bigger.

According to an article at SearchDataBackup, each EHT telescope can record data at a rate of 64 Gbps, and each observation period can last more than 10 hours. The author calculated that each site generated around half a petabyte of data per run. The distributed locations included volcanoes in Hawaii and Mexico, mountains in Arizona and the Spanish Sierra Nevada, the Chilean Atacama Desert, and Antarctica. The sites were kept in sync using precise atomic clocks and GPS to carefully time the observations.

The data from each telescope was recorded at 16 Gbps per recorder unit, distributed across 32 hard disk drives grouped into 4 modules of 8 disks each. By running 4 units in tandem, each site could record at a total rate of 64 Gbps.
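Those quoted numbers hang together. Here is a quick back-of-the-envelope check in Python (a sketch; the 10-hour run length is the article’s lower bound, so a longer run approaches the half-petabyte figure):

```python
# Sanity-check the recording numbers quoted above (decimal units).
GBIT = 1e9

unit_rate_bps = 16 * GBIT          # one recorder unit, writing to 32 drives
site_rate_bps = 64 * GBIT          # 4 recorder units in tandem
run_seconds   = 10 * 3600          # a 10-hour observation (runs can be longer)

per_drive_MBps = unit_rate_bps / 32 / 8 / 1e6
run_terabytes  = site_rate_bps / 8 * run_seconds / 1e12

print(f"per-drive write rate: {per_drive_MBps:.1f} MB/s")    # 62.5 MB/s
print(f"data per 10 h run:    {run_terabytes:.0f} TB/site")  # 288 TB, ~0.3 PB
```

At 62.5 MB/s per drive, each disk is written at a sustainable streaming rate, which is exactly why banks of ordinary hard drives could keep up.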

Sites making up the virtual Event Horizon Telescope.

 

One problem EHT ran into was the failure rate of traditional hard drives in the extreme telescope locations. ComputerWorld reports that 28 of 32 conventional hard drives failed at the Sierra Negra telescope, on the top of an extinct volcano in Mexico.

SearchDataBackup says the solution was helium hard drives. Hermetically sealed helium drives are self-contained environments, so they could survive the extreme conditions in which EHT’s telescopes operated. EHT first deployed helium hard drives in 2015. EHT data scientist Lindy Blackburn says EHT now uses about 1,000 helium drives with up to 10 TB of capacity from Western Digital, Seagate, and Toshiba. He told SearchDataBackup:

The move to helium-sealed drives was a major advancement for the EHT … Not only do they perform well at altitude and run cooler, but there have been very few failures over the years. For example, no drives failed during the EHT’s 2017 observing campaign.

The amount of data collected by EHT was too much to send over the Internet, so the researchers went old-school and used FedEx, sneakernet style, to ship the data for processing. Geoffrey Bower, an astronomer in Hawaii, told ScienceNews that mailing the disks is always a little nerve-wracking, though so far there have been no major shipping mishaps. But the cost and logistics involved in tracking and maintaining a multi-petabyte disk inventory are also challenging, so EHT is always on the lookout for another way to move petabyte-scale data.
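The arithmetic behind the sneakernet decision is stark. A rough comparison in Python (the 1 Gbps uplink and three-day courier time are illustrative assumptions, not EHT figures):

```python
# Why "FedEx sneakernet"? Effective throughput of shipping drives vs. uploading.
# The link speed and shipping time below are illustrative assumptions.
PETABYTE = 1e15  # bytes, decimal

data_bytes = 3.5 * PETABYTE          # the full 2017 EHT campaign
uplink_bps = 1e9                     # assume a dedicated 1 Gbps uplink
ship_days  = 3                       # assume 3-day courier delivery

upload_days = data_bytes * 8 / uplink_bps / 86400
ship_gbps   = data_bytes * 8 / (ship_days * 86400) / 1e9

print(f"upload over 1 Gbps:  {upload_days:.0f} days")   # ~324 days
print(f"courier 'bandwidth': {ship_gbps:.0f} Gbps")     # ~108 Gbps
```

It is the old “never underestimate the bandwidth of a station wagon full of tapes” lesson, replayed at petabyte scale.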

Cloud computing


SearchDataBackup points out that the cloud would normally be a good option for long-term storage of data unified from multiple, globally distributed endpoints. However, Mr. Blackburn told them the cloud was not a cold storage option for the project: the high recording speed and the sheer volume of data captured made it impractical to upload to a cloud. He explained, “At the moment, parallel recording to massive banks of hard drives, then physically shipping those drives somewhere is still the most practical solution.”

The data collected on the helium hard disk drive packs was processed by a grid computer made up of about 800 CPUs, all connected through a 40 Gbps network, at the MIT Haystack Observatory in Massachusetts and the Max Planck Institute for Radio Astronomy in Germany.
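At its core, what that correlator grid does is cross-correlation: lining up the recordings from pairs of telescopes to recover the tiny delays between them. A toy sketch of the idea in Python/NumPy (illustrative only; a real VLBI correlator also corrects for clock drift, atmospheric delay, and Doppler shifts):

```python
# Toy version of what a VLBI correlator does: recover the relative delay
# between two noisy recordings of the same signal.
import numpy as np

rng = np.random.default_rng(0)
n = 1 << 16
signal = rng.normal(size=n)                        # the common "sky" signal
true_delay = 137                                   # offset in samples
site_a = signal + 0.5 * rng.normal(size=n)         # noisy copy at site A
site_b = np.roll(signal, true_delay) + 0.5 * rng.normal(size=n)  # delayed copy

# Circular cross-correlation via FFT; the peak lag is the estimated delay.
xcorr = np.fft.ifft(np.fft.fft(site_b) * np.conj(np.fft.fft(site_a))).real
est = int(np.argmax(xcorr))
if est > n // 2:                                   # unwrap negative lags
    est -= n
print(f"true delay: {true_delay}, estimated: {est}")
```

Multiply that by every pair of telescopes, petabytes of samples, and picosecond timing requirements, and the 800-CPU grid starts to look modest.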

Katie Bouman is the MIT student who developed the algorithm that pieced together the data from the EHT’s disk drives.

Geoff Crew, co-leader of the EHT correlation working group at Haystack Observatory, told SearchDataBackup it is also impractical to use the cloud for computing. Mr. Crew said:

Cloud computing does not make sense today, as the volume of data would be prohibitively expensive to load into the cloud and, once there, might not be physically placed to be efficiently computed.

The EHT scientists built algorithms that converted sparse data into images. They developed a way to cut down the number of possible images by sorting out which results were physically plausible and which were wildly unlikely, making it far easier to create the images.
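The underlying difficulty is that eight telescopes sample only a sparse set of spatial frequencies, so many different images fit the same measurements; a plausibility prior picks among them. A toy sketch of that idea, assuming smoothness and nonnegative brightness as the priors (a stand-in for the EHT’s far more sophisticated methods, such as Katie Bouman’s CHIRP):

```python
# Toy imaging from sparse Fourier samples: many images fit the data, so a
# plausibility prior (smoothness + nonnegativity) selects one of them.
import numpy as np

rng = np.random.default_rng(1)
n = 32
# "True" sky: a bright ring, loosely like the M87* image.
y, x = np.mgrid[:n, :n] - n / 2
truth = np.exp(-((np.hypot(x, y) - 8) ** 2) / 4)

# Sample only 10% of Fourier coefficients (the sparse "baselines").
mask = rng.random((n, n)) < 0.10
mask[0, 0] = True                   # always keep the total-flux term
data = np.fft.fft2(truth) * mask

# Gradient descent on  ||F(img)*mask - data||^2 + lam * (roughness of img)
img, lam = np.zeros((n, n)), 0.1
for _ in range(500):
    resid = np.fft.fft2(img) * mask - data
    grad_fit = np.fft.ifft2(resid * mask).real
    grad_smooth = img - (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                         + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4
    img -= 0.1 * (grad_fit + lam * grad_smooth)
    img = np.clip(img, 0, None)     # plausibility: sky brightness >= 0

print("relative error:", np.linalg.norm(img - truth) / np.linalg.norm(truth))
```

The point of the sketch is the structure of the problem, not the quality of this particular reconstruction: the data term only constrains the sampled frequencies, and the priors fill in everything else.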

The Haystack VLBI Correlator grid computer at the MIT Haystack Observatory.

Converting sparse data into images matters beyond astronomy. Mr. Blackburn told FiveThirtyEight that the problem comes up in other areas as well: it occurs in medical imaging, when doctors use MRIs to convert radio waves into pictures of your body, and it is a key part of self-driving cars, which rely on computer vision to “see” everything from potholes to people.

Just like any enterprise, EHT had to find a workable method of data protection, and that includes deciding what won’t be protected. EHT has not found a cost-effective way to replicate or protect the raw radio signal data from the telescope sites. However, once the data has been processed and reduced to tens of petabytes, it is backed up on-site on several different RAID systems and on Google Cloud Storage. Mr. Crew told SearchDataBackup:

The reduced data is archived and replicated to a number of internal EHT sites for the use of the team, and eventually, it will all be publicly archived. The raw data isn’t saved; we presently do not have any efficient and cost-effective means to back it up.

Mr. Blackburn said the raw data isn’t worth backing up: given the complexity of protecting such a large amount of data, it would be simpler to run another observation and gather a fresh set. As he put it, “Backing up original raw data to preserve every bit is not so important.”

Mr. Blackburn said he can’t seriously consider implementing a backup process unless it is “sufficiently straightforward and economical.”

Instead, he said, he’s looking at where technology might be in the next five or 10 years to find the best method for handling petabyte-scale raw data from the telescopes. Mr. Blackburn told SearchDataBackup:

Right now, it is not clear if that will be continuing to record to hard drives and using special-purpose correlation clusters, recording to hard drives and getting the data as quickly as possible to the cloud, or if SSD or even tape technology will progress to a point where they are competitive in both cost and speed to hard disks.

rb-

The image of the black hole validated Einstein’s general theory of relativity and proved that enterprise-class IT can solve intergalactic problems.

The EHT team had to figure out how to save, move, and back up massive quantities of data and, of course, do more with less. EHT’s Geoff Crew summed up the problem most IT pros have: “Most of our challenges are related to insufficient money, rather than technical hurdles.”

Related articles
  • Trolls hijacked a scientist’s image to attack Katie Bouman. They picked the wrong astrophysicist. (MSN)

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

The Evolution of Backup

Have you ever stopped to think about how the technology for data protection has evolved? Backup has been around, in one form or another, since 3000 B.C. It has evolved and adapted to take advantage of improvements in technology platforms. Storage vendor Axcient traces the evolution of backup technology from clay tablets to the cloud in the infographic below.

Axcient traces the evolution of backup and key events in backup methods.

Axcient infographic: the evolution of backup.

According to CrunchBase, Axcient is an entirely new type of cloud platform. Their technology stack eliminates data loss, keeps applications up and running, and makes sure that IT infrastructures never go down.

Axcient is designed for today’s always-on business. The system replaces legacy backup, business continuity, and disaster recovery software and hardware. They claim it reduces the amount of expensive copy data in an organization by as much as 80%.

By mirroring an entire business in the cloud, Axcient makes it simple to access and restore data from any device. They claim that with a single click their app can configure failover systems and virtualize your entire office, all from a single deduplicated copy.
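That “single deduplicated copy” idea is worth unpacking. A minimal sketch of content-addressed deduplication in Python (fixed-size chunks are an assumption here; real products use smarter, variable-size chunking):

```python
# Minimal content-addressed deduplication: identical chunks are stored once
# and referenced by the hash of their contents.
import hashlib

CHUNK = 4096                    # fixed-size chunks, for simplicity
store: dict[str, bytes] = {}    # hash -> chunk, each unique chunk kept once

def dedup_write(data: bytes) -> list[str]:
    """Store each unique chunk once; return the 'recipe' of chunk hashes."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # only new content consumes space
        recipe.append(digest)
    return recipe

def dedup_read(recipe: list[str]) -> bytes:
    return b"".join(store[d] for d in recipe)

# Two "backups" that share most content use almost no extra space.
first = b"x" * 100_000
second = b"x" * 100_000 + b"new tail"
r1, r2 = dedup_write(first), dedup_write(second)
assert dedup_read(r1) == first and dedup_read(r2) == second
print(f"logical: {len(first) + len(second)} bytes, "
      f"stored: {sum(map(len, store.values()))} bytes")
```

Run it and roughly 200 KB of logical data lands in a few KB of storage, which is the mechanism behind claims of large copy-data reductions.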

rb-

The key to any successful Business Continuity Plan is a solid, verified backup plan. The impact of a major data loss on an SMB can be devastating. The actual numbers are debatable, but it seems that a significant number of firms go out of business after a major data loss.

There are many ways to back up your data, from Acronis, Axcient, Barracuda (CUDA), EMC (EMC), Exagrid, HP (HPQ), IBM (IBM), Symantec (SYMC), and Veeam. What is important is that you have a plan, execute it, and test it, as the sketch below illustrates.
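Testing means proving you can actually restore. A minimal sketch in Python that checksums a live tree against a restored copy (the two paths are hypothetical examples):

```python
# Testing a backup means proving you can restore it: compare SHA-256
# checksums of the live tree against a restored copy.
import hashlib
from pathlib import Path

def tree_digests(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to the SHA-256 of its contents."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

source = tree_digests(Path("/data/live"))           # what you meant to protect
restored = tree_digests(Path("/tmp/restore-test"))  # what the backup gave back

if source == restored:
    print("restore verified: every file matches")
else:
    bad = {k for k in source.keys() | restored.keys()
           if source.get(k) != restored.get(k)}
    print(f"restore FAILED: {len(bad)} files differ or are missing")
```

A backup you have never restored is a hope, not a plan.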


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Need Cyber Insurance?

Standard business insurance does not cover data breaches or almost any other loss involving data; standard insurance covers tangible losses and damage, and data isn’t tangible, says Network World. The question of whether data is tangible goes back to a 2000 ruling by a U.S. District Court. The article explains that the ruling arose from an Arizona case, American Guarantee & Liability Insurance Co. v. Ingram Micro Inc. In that case, the court said that a computer outage caused by a power problem constituted physical damage within the meaning of the policy Ingram Micro had purchased from American Guarantee.

After that, the insurance firms changed their policies to state that data is not considered tangible property, Kevin Kalinich, national managing director for network risk at Aon Risk Solutions, told Network World. The upshot is that an enterprise needs special cyber insurance to cover data-related issues. The problem is that the field is new, and there is no such thing as standard coverage at a standard price.

Larry Ponemon, chairman of the Ponemon Institute, told Network World that the resulting complexity is a major source of push-back by potential buyers. “The policies have limitations and constraints similar to home policies with act-of-God provisions, and that has created a lot of uncertainty about what is covered, and what the risks are.” Mr. Ponemon told the author, “Those who are nevertheless purchasing cyber insurance are typically very selective about what coverage they want.”

Network World describes the types of cyber coverage available.

Data breach coverage: Pays for expenses that result from a data breach. Covered expenses typically include notification of the victims, setting up a call center, credit monitoring and credit restoration services for the victims, and other crisis management services, Ken Goldstein, vice president at the Chubb Group, told Network World. “You might want to hire forensic experts, independent attorneys for guidance concerning the multiple state (data breach notification) laws, and public relations experts.”

Regulatory civil action coverage: Pays in cases where the insured is facing fines from a state attorney general after a data breach, or from the federal government after a violation of the Health Insurance Portability and Accountability Act (HIPAA) or similar regulations. Some policies only cover the cost of defending against the action, while others may pay the fine as well, says Steven Haase, head of INSUREtrust, an Atlanta-based specialty insurance provider.

Cyber extortion coverage: For cases where a hacker steals data from the policyholder and then tries to sell it back, or someone plants a logic bomb in the policyholder’s system and demands payment to disable it. Among other things, the policy should cover the cost of a negotiator and the cost of offering a reward leading to the arrest of the perpetrator, Chubb’s Goldstein says.

Virus liability: Pays in cases where the policyholder is sued by someone who claims to have gotten a virus from the policyholder’s system.

Content liability: Covers lawsuits filed by people angered over something posted on the policyholder’s Web site. Such coverage should also cover copyright claims and domain name disputes, INSUREtrust’s Haase told Network World.

Lost income coverage: Replaces revenue lost while the policyholder’s computer system or Web site is down. But Aon’s Kalinich notes that insurers often apply minimum downtimes of 12 or 24 hours, or require proof of actual losses: “They’ll say that, after all, the customers who did not get through (during the outage) could have come back later.”

Loss of data coverage: Pays for the cost of replacing the policyholder’s data in case of loss. “Backup policies are not always effective, and accidents and sabotage happen,” Mr. Haase says.

Errors and omissions coverage: Otherwise known as E&O policies, this type of coverage predates cyber insurance but is increasingly added to cyber policies to cover alleged failures of the policyholder’s software, Haase says.

As for what coverage costs, Aon’s Kalinich told Network World that firms with less than $100 million in annual revenue can expect to pay $5,000 to $15,000 per million of coverage, while larger firms would pay $10,000 to $25,000; for those over a billion, the price can be in the $20,000 to $50,000 range. Robert Parisi, senior vice president with Marsh, an insurance broker and risk advisory firm, put it more simply, saying the cost is between $7,000 and $35,000 per million. Of course, the lower ranges are for buyers who look like better risks, and deciding who is a better risk is another factor that makes cyber insurance a complex topic.
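Expressed as arithmetic, those quoted ranges work out as in the sketch below (illustrative only; actual rates vary by insurer and by how risky the buyer looks):

```python
# Rough premium ranges quoted above, as arithmetic.
# Rates are per $1M of coverage and vary with insurer and risk profile.
def premium_range(revenue_musd: float, coverage_musd: float) -> tuple[float, float]:
    if revenue_musd < 100:
        low, high = 5_000, 15_000      # under $100M annual revenue
    elif revenue_musd < 1_000:
        low, high = 10_000, 25_000     # larger firms
    else:
        low, high = 20_000, 50_000     # over $1B
    return low * coverage_musd, high * coverage_musd

# e.g. a $50M-revenue firm buying $5M of cyber coverage:
lo, hi = premium_range(50, 5)
print(f"${lo:,.0f} to ${hi:,.0f} per year")   # $25,000 to $75,000
```

So a $50 million firm buying $5 million of coverage should budget somewhere between $25,000 and $75,000 a year, before any better-risk discount.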

“You cannot get good insurance unless you have good security practices,” VP Kalinich says. “Due diligence underwriting has become more streamlined as the insurers have learned what to look for. They will typically benchmark you against other members of your industry.”

INSUREtrust’s Haase explained the cyber insurance purchase process to the author: “This is a complex purchase and you need a professional helping you. Most policies are highly customizable, and there are a lot of endorsements.” Typically the buyer goes to their local agent, and the local agent uses a specialist, Haase says. Both the local agent and the specialist get commissions ranging from 7.5% to 10%, so 15% to 20% of the premium goes to commissions.

Finally, Toby Merrill, vice president of insurer Ace Professional Risk, cautions that cyber insurance buyers must understand that if they are outsourcing their data handling, they are not at the same time outsourcing their liability if there is a data breach. The onus of the various breach notification laws is on the organization that gathered the data, not on the organization that was storing it when it was exposed, he notes.

“Cyber insurance is not there to replace sound risk management,” VP Merrill told Network World. “It is there to supplement it.”


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.