Tag Archive for Backup

Today is World Backup Day

take responsibility for backing up your data

March 31st is World Backup Day. Now is a good time to check your backup systems. You have a plan, right? Now that so many of us are working from home, we can’t rely on Jim the backup admin to automagically make the files that mysteriously disappeared all by themselves reappear.

You should take responsibility for backing up your data. From the data that keeps your business moving to the personal information you share and store online, your devices hold the files, images, and conversations that matter most.

According to WorldBackUpDay.com (not HTTPS), World Backup Day was founded by a few “concerned users” on Reddit. The day’s dedication is a decidedly serious one. March 31 was established as:

… a day for people to learn about the increasing role of data in our lives and the importance of regular backups.

What’s a backup?

Cloud backup

A backup is a full image copy of all the data stored on a device like your desktop, laptop, or tablet. By storing this second copy, everything on your device that matters to you is safe and accessible in the event of accidental deletion, system failure, or ransomware attack.
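To make the idea concrete, here is a minimal sketch in Python of creating that second copy. The paths are hypothetical placeholders, and a real backup tool would add scheduling, verification, and versioning on top of this:

```python
import tarfile
from datetime import datetime
from pathlib import Path

# Hypothetical paths: the folder you care about and a separate drive.
SOURCE = Path.home() / "Documents"
DEST = Path("/mnt/backup_drive")

def make_backup(source: Path, dest: Path) -> Path:
    """Archive a folder into a timestamped tarball on another disk."""
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = dest / f"{source.name}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source, arcname=source.name)
    return archive

if __name__ == "__main__":
    print(f"Backup written to {make_backup(SOURCE, DEST)}")
```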

Why back up? Business continuity begins with backups; restoring data from those backups keeps the business up and running. At work, your devices store irreplaceable information. Unfortunately, it’s very easy, and costly, to lose data. Over $600 billion is lost to cybercrime each year, according to a 2018 McAfee report (PDF).

Stay safe out there!

Related article

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Why Don’t Users Protect Themselves

A new report (PDF) from recently swallowed, and swallowed again, Webroot says that American technology users overestimate their level of cyber hygiene. Cyber hygiene is a cybersecurity risk-mitigation technique introduced by Vinton Cerf in 2000, in which you train yourself to think proactively about your cybersecurity. The goal is to resist cyber threats and online security issues, protect and maintain IT systems and devices, and implement cybersecurity best practices, just as you do with your daily personal hygiene.

The report says U.S. users do not know how to protect themselves from cyber threats. Americans are overconfident in the perceived protection they have. The endpoint security and threat intelligence provider found that 88% of interviewed Americans believe they are taking the appropriate steps to protect themselves from cyber-attacks.

Their confidence is misplaced. Americans have only a surface-level understanding of the most common types of cyber threats, according to Webroot. Most can recognize the names of the most common cyber-attacks, such as malware (79%) or phishing (70%), but for most, that’s where the knowledge ends. Very few (less than 1 in 3) actually know what these common cyber-attacks are or what they do.

While Americans claim to have heard of some of the most common cyber-attack terms when prompted, very few actually understand what those cyber-attacks are. When asked about critical cyber-hygiene issues like malware, backups, passwords, and identity theft, surveyed Americans reported:

Malware – 79% have heard of malware, but only 28% can confidently explain what it is. 82% are using some sort of AV software on their personal devices. 62% of those who use AV software use a free product. Only 20% update their AV software each time they are prompted.

Backups – Backups are another weakness. 78% of respondents report backing up their data. However, 57% are still leaving themselves susceptible to risk by backing up using only one method, rather than backing up both online (cloud) and offline:

  • 34% automatically back up to the cloud
  • 27% back up to an external hard drive
  • 24% back up to a USB stick
  • 22% back up locally on My Computer
  • 17% back up manually to the cloud
  • 22% rarely or never back up their data.

Among those who back up their information by uploading it to the cloud, only 43% take the extra step of ensuring that it’s stored in an encrypted format.
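Encrypting before you upload is the step most people skip, yet it is only a few lines of code. Here is a hedged sketch using Python’s third-party cryptography package (an assumption on my part; any authenticated-encryption library would do), with a hypothetical file name. The file is encrypted locally, so the cloud provider only ever sees ciphertext:

```python
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_for_upload(path: Path, key: bytes) -> Path:
    """Encrypt a file locally so the cloud only ever sees ciphertext."""
    token = Fernet(key).encrypt(path.read_bytes())
    out = path.with_suffix(path.suffix + ".enc")
    out.write_bytes(token)
    return out

key = Fernet.generate_key()  # keep this key safe, and NOT in the same cloud
ready = encrypt_for_upload(Path("taxes-2019.pdf"), key)  # hypothetical file
print(f"Safe to upload: {ready}")
```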

Passwords – Followers of the Bach Seat know that passwords suck, and the Webroot report confirms it. 33% of Americans admit to sharing their passwords with others. To make matters worse, 63% are reusing passwords across multiple accounts. The research found that Americans have, on average, 9 passwords for 17 accounts.
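Reuse is what turns one breach into many: a single leaked password unlocks every account that shares it. A minimal sketch of the fix, using Python’s standard secrets module to mint a unique random password per account (which is essentially what a password manager automates):

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def new_password(length: int = 16) -> str:
    # secrets draws from the OS CSPRNG, unlike the random module
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One unique password per account, so no single leak spreads
for account in ("email", "bank", "shopping"):
    print(account, new_password())
```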

Mobile – While on the go, 67% of Americans use public Wi-Fi, but only 35% take the extra step to protect themselves by using a VPN. Additionally, 34% use a work device as their primary personal device at home.

Identity theft – 74% of Americans believe their identity has never been stolen.

According to the Webroot whitepaper, the 5 most cyber risky U.S. states are:

  1. Mississippi
  2. Louisiana
  3. California
  4. Alaska
  5. Connecticut

The 5 least risky U.S. states are:

  1. New Hampshire
  2. North Dakota
  3. Ohio
  4. Idaho
  5. Kentucky

rb-

According to the research, conducted by Wakefield for Webroot, Michigan ranked 31st among the 50 states. Overall, the average home user scored 60% for cyber hygiene. The researchers also found that those they classified as “Superstars” tended to be:

  • A Boomer
  • Married or in a relationship
  • Suburbanite
  • Not a parent.

Related article

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Black Hole Data

The first image of a black hole was published on April 10, 2019. The black hole, M87*, at the center of the Messier 87 galaxy, is located 53 million light-years from Earth. NASA says a black hole is an extremely dense object from which no light can escape. Anything that comes within a black hole’s “event horizon” will be consumed because of the black hole’s unimaginably strong gravity.

the first image of a black hole

By its very nature, a black hole cannot be seen; the bright ring in the picture is the event horizon, the point beyond which an object approaching the black hole is unable to escape its gravitational pull. Objects that pass into the event horizon undergo spaghettification, a process first described by Stephen Hawking in which gravitational forces stretch the object out like a piece of pasta. The M87* image shows a silhouette of the black hole against the glow of the event horizon, captured by researchers at the Event Horizon Telescope (EHT).

The EHT is the brainchild of Shep Doeleman, the director of the EHT and an astronomer at the Harvard-Smithsonian Center for Astrophysics. It is a virtual global array of eight ground-based radio telescopes. The EHT captured around 3.5 PB of data for the black hole image in April 2017. It then took two years to correlate the data to form the image. The EHT team not only had to figure out intergalactic science but also massive information technology problems. The researchers had to solve IT problems pretty typical for enterprise IT professionals, only bigger.

According to an article at SearchDataBackup each EHT telescope can record data at a rate of 64 Gbps, and each observation period can last more than 10 hours. The author calculated that each site generated around half a petabyte of data per run. The distributed locations included volcanoes in Hawaii and Mexico, mountains in Arizona and the Spanish Sierra Nevada, the Chilean Atacama Desert, and Antarctica. The sites were kept in sync using precise atomic clocks and GPS systems to carefully time the observations.

Each recorder captured data at 16 Gbps, distributing it among 32 hard disk drives grouped into 4 modules of 8 disks each. By running 4 recording units in tandem, each site reached the total rate of 64 Gbps.
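The arithmetic behind those figures is easy to check. A quick sketch (plain unit conversion, nothing EHT-specific assumed):

```python
# Each site: 4 recorders x 16 Gbps = 64 Gbps aggregate
rate_bps = 4 * 16e9                       # bits per second
hours = 10                                # "more than 10 hours" per run

volume_bytes = rate_bps / 8 * hours * 3600
print(f"{volume_bytes / 1e12:.0f} TB per site per run")  # ~288 TB
# Runs longer than 10 hours push each site toward the roughly
# half-petabyte-per-run figure calculated above.
```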

Sites making up the virtual Event Horizon Telescope.

 

One problem EHT ran into was the failure rate of traditional hard drives in the extreme telescope locations. ComputerWorld reports that 28 of 32 conventional hard drives failed at the Sierra Negra telescope, atop an extinct volcano in Mexico.

SearchDataBackup says the solution was helium hard drives. The hermetically sealed helium drives are self-contained environments, so they could survive the extreme conditions in which EHT’s telescopes operated. EHT first deployed helium hard drives in 2015. EHT data scientist Lindy Blackburn told SearchDataBackup that EHT now uses about 1,000 helium drives, with up to 10 TB of capacity, from Western Digital, Seagate, and Toshiba. He said,

The move to helium-sealed drives was a major advancement for the EHT … Not only do they perform well at altitude and run cooler, but there have been very few failures over the years. For example, no drives failed during the EHT’s 2017 observing campaign.

The amount of data collected by EHT was too much to send over the Internet, so the researchers went old-school and used FedEx, sneakernet style, to send the data to be processed. Geoffrey Bower, an astronomer in Hawaii, told ScienceNews that mailing the disks is always a little nerve-wracking. So far, there have been no major shipping mishaps. But the cost and logistics involved in tracking and maintaining a multi-petabyte disk inventory are also challenging. Therefore, EHT is always on the lookout for another method to move petabyte-scale data.
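It is the old “never underestimate the bandwidth of a station wagon” lesson. As a rough illustration, assuming a hypothetical two-day courier shipment (the actual transit times are not given here), moving the full 3.5 PB works out to an effective bandwidth no observatory uplink could match:

```python
# Effective bandwidth of shipping 3.5 PB of disk packs in ~2 days
petabytes = 3.5
transit_seconds = 2 * 24 * 3600           # hypothetical transit time
bits = petabytes * 1e15 * 8
print(f"{bits / transit_seconds / 1e9:.0f} Gbps effective")  # ~162 Gbps
```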

Cloud computing


SearchDataBackup points out that normally the cloud would be a good option for long-term storage and for unifying data sourced from multiple, globally distributed endpoints. However, Mr. Blackburn told them the cloud was not a cold storage option for the project. He said the high recording speed and the sheer volume of data captured made it impractical to upload to a cloud. He explained, “At the moment, parallel recording to massive banks of hard drives, then physically shipping those drives somewhere is still the most practical solution.”

The data collected on the helium hard disk drive packs was processed by a grid computer made up of about 800 CPUs, all connected through a 40 Gbps network, at the MIT Haystack Observatory in Massachusetts and the Max Planck Institute for Radio Astronomy in Germany.

Katie Bouman, the MIT student who developed the algorithm that pieced together the data from the EHT, with disk drives

Geoff Crew, co-leader of the EHT correlation working group at Haystack Observatory, told SearchDataBackup that it is also impractical to use the cloud for computing. Mr. Crew said:

Cloud computing does not make sense today, as the volume of data would be prohibitively expensive to load into the cloud and, once there, might not be physically placed to be efficiently computed.

The EHT scientists built algorithms that converted sparse data into images. They developed a way to cut the number of possible images by sorting out which results were physically plausible and which were wildly unlikely, making it easier to create the images.
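This is not EHT’s actual pipeline (Katie Bouman’s CHIRP algorithm is far more sophisticated), but a toy sketch conveys the core idea: with only a few Fourier-domain samples, infinitely many images fit the data, and a plausibility prior, here simple smoothness regularization, picks one out:

```python
import numpy as np

# Toy 1D analogue of imaging from sparse interferometer data:
# observe a handful of Fourier coefficients, then choose the smooth,
# "physically plausible" signal consistent with them.
rng = np.random.default_rng(0)
n = 64
x_true = np.zeros(n)
x_true[20:28] = 1.0                       # a simple bright "source"

F = np.fft.fft(np.eye(n))                 # full DFT matrix
# Sparse coverage: keep frequency 0 (total flux) plus 11 random ones
keep = np.concatenate(([0], rng.choice(np.arange(1, n), 11, replace=False)))
A = F[keep]                               # measurement operator
y = A @ x_true                            # the observed "visibilities"

# 12 equations, 64 unknowns: infinitely many images fit the data.
# A roughness penalty (Tikhonov regularization) selects a smooth one.
D = np.diff(np.eye(n), axis=0)            # finite-difference operator
x_hat = np.linalg.solve(A.conj().T @ A + 0.1 * D.T @ D, A.conj().T @ y)
print(np.round(x_hat.real[16:32], 2))     # mass concentrates near the source
```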

The Haystack VLBI Correlator grid computer at the MIT Haystack Observatory

Converting sparse data into images matters beyond astronomy. Mr. Blackburn told 538 that the problem comes up in other areas as well; it occurs in medical imaging when doctors use MRIs to convert radio waves into pictures of your body. It’s also a key part of self-driving cars, which rely on computer vision to “see” everything from potholes to people.

Just like any enterprise, EHT had to find a workable method of data protection. That includes deciding what won’t be protected. EHT has not found a cost-effective way to replicate or protect the raw radio-signal data from the telescope sites. However, once the data has been processed and reduced to tens of petabytes, it is backed up on-site on several different RAID systems and on Google Cloud Storage. Mr. Crew told SearchDataBackup:

The reduced data is archived and replicated to a number of internal EHT sites for the use of the team, and eventually, it will all be publicly archived. The raw data isn’t saved; we presently do not have any efficient and cost-effective means to back it up.

Mr. Blackburn said the raw data isn’t worth backing up. Because of the complexity of protecting such a large amount of data, it would be simpler to run another observation and gather a new set of data. Mr. Blackburn said, “Backing up original raw data to preserve every bit is not so important.”

Mr. Blackburn said he can’t seriously consider implementing a backup process unless it is “sufficiently straightforward and economical.”

Instead, he said he’s looking at where technology might be in the next five or ten years to find the best method to handle petabyte-scale raw data from the telescopes. Mr. Blackburn told SearchDataBackup:

Right now, it is not clear if that will be continuing to record to hard drives and using special-purpose correlation clusters, recording to hard drives and getting the data as quickly as possible to the cloud, or if SSD or even tape technology will progress to a point where they are competitive in both cost and speed to hard disks.

rb-

The image of the black hole validated Einstein’s general theory of relativity and proved that enterprise-class IT can solve intergalactic problems.

The EHT team had to figure out how to save, move, and back up massive quantities of data and, of course, do more with less. EHT’s Geoff Crew summed up the problem most IT pros have: “Most of our challenges are related to insufficient money, rather than technical hurdles.”

Related articles
  • Trolls hijacked a scientist’s image to attack Katie Bouman. They picked the wrong astrophysicist. (MSN)

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Super-Sized Storage Saves Tape

The LTO Program Technology Provider Companies (TPCs) recently announced the extension of the LTO tape roadmap to generations 9 and 10. SearchStorage says that Linear Tape-Open (LTO) is an open-format tape storage technology. LTO was developed by Hewlett-Packard (HPQ), International Business Machines (IBM), and Certance (Quantum (QMCO) acquired Certance in 2004). The term “open format” means that users have access to multiple sources of compatible storage media products, which saves tape backups from being replaced.

LTO Tape Backups

SearchStorage reports that the LTO tape vendors plan to grow the technology to super-size. LTO-9 will offer up to 25 TB of native capacity, and LTO-10 will offer 48 TB. Transfer rates will also increase over earlier generations: LTO-9 and LTO-10 will offer transfer rates of 708 MBps and 1,100 MBps, respectively, making tape backups faster.
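Those capacity and speed numbers move together for a reason: backup windows. A quick sketch of how long it would take to stream a full tape at the quoted native rates (ignoring compression, repositioning, and verify passes):

```python
# Hours to fill one tape at its native transfer rate
for gen, capacity_tb, rate_mbps in (("LTO-9", 25, 708), ("LTO-10", 48, 1100)):
    hours = capacity_tb * 1e12 / (rate_mbps * 1e6) / 3600
    print(f"{gen}: {hours:.1f} hours")    # ~9.8 h and ~12.1 h
```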

LTO Roadmap

The new generations will allow you to keep your existing tape backups. The new LTO will include read-and-write backward compatibility with tapes from the previous generation, plus read compatibility with tapes from the previous two generations. The new generations will also continue to support LTFS, WORM functionality, and encryption.

LTO Generation | Product shipped | Native capacity (TB) | Transfer rate (MBps) | Compatible with | Notes
LTO-1 | 2000 | 0.1 | 20 | LTO-1 |
LTO-2 | 2003 | 0.2 | 40 | LTO-1 |
LTO-3 | 2005 | 0.4 | 80 | LTO-2 & 1 |
LTO-4 | 2007 | 0.8 | 120 | LTO-3 & 2 |
LTO-5 | 2010 | 1.5 | 140 | LTO-4 & 3 |
LTO-6 | 2012 | 2.5 | 160 | LTO-5 & 4 | Current standard
LTO-7 | 2015? | 6.4 | 315 | LTO-6 & 5 | Development
LTO-8 | 2017? | 12.8 | 472 | LTO-7 & 6 | Development
LTO-9 | TBD | 25 | 708 | LTO-8 & 7 | Development
LTO-10 | TBD | 48 | 1,100 | LTO-9 & 8 | Development

Another super sized storage option

In case you are not an LTO user, FierceCIO reports that Sony (SNE) has developed its own super-sized storage tape: a magnetic tape cassette capable of storing 185 TB of data by optimizing its nanotechnology process.

According to the article, Sony optimized its “sputter deposition” technology to create a soft magnetic layer, allowing it to shrink the magnetic particles on the storage layer to an average size of 7.7 nm and increase density. As a result, the Japanese firm’s forthcoming cassettes will be able to store 74 times more data than conventional tape media, the equivalent of 3,700 Blu-ray discs.

The creation of a 185 TB cassette will no doubt be welcomed by large enterprises as they try not to be overwhelmed by the explosion in big data. Various studies estimate that the amount of data stored will increase 50-fold in the next decade. IDC predicts that by 2020, over 40 trillion gigabytes of data will be stored around the globe.

rb-

Not so fast; these developments are not the holy grail of backups.

I know of several organizations that have dragged their fiscal feet and are still running LTO-1 or LTO-2. They have limited their own upgrade path. Right there in the LTO.org specs, it says that LTO drives only support cartridges from the previous two generations.

FierceCIO speculates that, after cost, Sony’s biggest challenge with a 185 TB tape will be making it sufficiently fast in terms of read and write performance, with the possible need for non-conventional peripheral interconnects, so that data backups can be completed within ever-shrinking backup windows.

Related articles

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook and Twitter. Email the Bach Seat here.

Cloud Storage, Back-Up Bust

These are heady times for fans of cloud storage. In case you haven’t been paying attention, the cloud file storage corner of the IT universe has been heating up for the past couple of months. Dropbox, Microsoft (MSFT), Google (GOOG), and Apple (AAPL) have been leapfrogging each other in an apparent effort to buy mind-share.

Dropbox recently announced that its Dropbox Pro plan will now offer 1TB of capacity for $9.99 a month, or $99 for a full-year subscription. Paul Mah at FierceCIO says this is a significant reduction, especially considering that until recently, monthly Dropbox Pro storage prices were:

  • $9.99 for 100GB,
  • $19.99 for 200GB, and
  • $49.99 for 500GB of storage.

Mr. Mah says the latest move by Dropbox allows it to stay on par with the latest price cuts from Apple iCloud in September, Google Drive in June, and Microsoft OneDrive in May.

In September Apple updated its porous iCloud storage plans. CNET says the basic 5 gigabytes of storage remains free, though prices for paid tiers were significantly reduced and larger storage options were made available. CNET says the new monthly iCloud storage costs are:

  • Free for 5GB,
  • $0.99 for 20GB,
  • $3.99 for 200GB,
  • $9.99 for 500GB and
  • $19.99 for 1TB.

Previously, 10GB of storage would have cost $20 per year, 20GB for $40 per year, and 50GB for $100 per year.

At Microsoft, the cloud-based file storage game also changed. According to Redmond Magazine, the improvements include a new file upload limit (10GB max), an easier way to share links to OneDrive folders, and support for folder drag-and-drop operations using the Google Chrome browser. Microsoft is also working on speeding up the synchronization of files with OneDrive. The updated monthly prices for OneDrive are:

  • Free for 15GB,
  • $1.99 for 100GB,
  • $3.99 for 200GB,
  • $5.99 for 1TB.

In an attempt to trump MSFT, Google released Google Drive for Work, a paid service targeted at business users and priced at $10 per user per month. FierceCIO noted that the new service offers unlimited storage, the ability to upload files of up to 5TB in size, and access to productivity apps such as Docs, Sheets, Slides, Hangouts, and Sites. Importantly, Google also announced that files uploaded to Google Drive can be encrypted, and will stay that way while in transit or at rest on its servers. Here are the current prices per month for Google Drive space, according to CNET:

  • Free for 15GB,
  • $1.99 for 100GB,
  • $9.99 for 1TB,
  • $99.99 for 10TB,
  • $199.99 for 20TB and
  • $299.99 for 30TB.

Mr. Mah argues that price drops are good news for consumers. The extra space would certainly be useful for users who rely on it for long-term file archives or backing up large local files. The author correctly argues that 1TB of online storage does not deliver the same value to business users. The reason is simple: cloud storage is a terrible backup solution for large volumes of data, especially if you need to get it back quickly.

Mr. Mah observed that cloud storage vendors do not publish any guaranteed upload or download speeds. This is noteworthy considering that 1TB of files can take a really long time to transfer over the Internet.

He explains that downloading 1TB worth of files with zero data overhead (which is impossible) across a reasonable 10Mbps broadband connection would take over 222 hours, or close to 10 days of continuous downloading. You can be assured that real-life conditions on your broadband connection would likely mean that this is at least doubled or even tripled.
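That 222-hour figure is easy to verify. A best-case sketch; real links add protocol overhead, congestion, and throttling:

```python
# Best case: 1 TB over a dedicated 10 Mbps link with zero overhead
terabytes = 1
link_mbps = 10
hours = terabytes * 1e12 * 8 / (link_mbps * 1e6) / 3600
print(f"{hours:.0f} hours (~{hours / 24:.0f} days)")  # 222 hours, ~9 days
```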

And that’s assuming that the cloud service provider isn’t experiencing any congestion on its end, which is not something that cloud vendors are offering any guarantees on. Notwithstanding that, you can check out this nifty online calculator.

So while there is no question about the value of cloud storage for data synchronization across multiple devices, it is important for businesses to understand that the cloud just isn’t ideal for data backup. Mr. Mah concludes that users should use their 1TB of cloud space for all it’s worth, but users and firms need to do proper local backups for important files, as well as those that need to be restored quickly.

Related articles

 

Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.