
The Trouble with Tape

What's the Best Backup Solution for You?

Tape backups are expensive, require a lot of manual intervention, leave gaps in protection and have high failure rates. If you can’t afford to lose time and data, software-based replication and restoration may be a better solution.

Is tape right for you?

Tape backup used to be the only option for backing up servers. However, tape backup has inherent problems that can go quickly from inconvenient to disastrous. Consider some of the issues and how they would affect your business in a disaster or system outage:

  • Tape backup hardware and software are expensive, especially if you have multiple offices.
  • Making backups every day requires manual intervention - it’s easy to forget or skip it.
  • Tape backup always involves downtime - you can’t back up a system that is in use.
  • Tapes are easily damaged, lost or destroyed.
  • At best, you’ll be recovering from yesterday’s data.
  • 40% of restoration attempts from tape fail - can you afford to permanently lose your data?
  • Tape restoration, when it works, involves hours or days of complete downtime.

The costs of tape backup

  • Acquisition and ongoing maintenance of hardware.
  • Acquisition of backup software and ongoing maintenance/support.
  • Acquisition and replacement of tape media.
  • Offsite storage and transportation costs.
  • Operation costs for performing backup and recovery.
  • Cost of downtime incurred during recovery.
  • Cost of data loss due to recovering to previous night’s data.

What will downtime cost you?

When considering a backup and recovery strategy, it’s helpful to determine your cost of downtime. (Large businesses average $42,000 per hour, small businesses average $18,000.) Here’s a simple way to estimate the average cost per hour of downtime: Cost per Occurrence = (To + Td) x (Hr + Lr).

In this equation, To is the length of the outage in hours, Td is the time since the last backup (the window of data that must be recreated), Hr is the hourly personnel rate (monthly expense per department divided by the number of work hours), and Lr is the lost revenue per hour, which applies if the department generates revenue. A good rule of thumb is to take profitability over three months and divide by the number of work hours.
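The calculation above can be sketched in a few lines of code. The figures in the example are hypothetical, chosen only to illustrate the arithmetic:

```python
def downtime_cost_per_occurrence(outage_hours, hours_since_backup,
                                 hourly_personnel_rate, lost_revenue_per_hour):
    """Estimate the cost of one outage: (To + Td) x (Hr + Lr)."""
    return ((outage_hours + hours_since_backup)
            * (hourly_personnel_rate + lost_revenue_per_hour))

# Hypothetical example: a 2-hour outage, 10 hours since the last tape
# backup, $500/hour in personnel cost, $1,500/hour in lost revenue.
cost = downtime_cost_per_occurrence(2, 10, 500, 1500)
print(f"${cost:,.0f}")  # → $24,000
```

Note how the time since the last backup (Td) dominates the result: even a short outage is expensive when the backup window is long.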

How much can you afford to lose?

It is also helpful to determine how much data and how much time you can afford to lose, known respectively as the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO).

Recovery Point Objective

The first, the Recovery Point Objective (RPO), is the threshold of how much data you can afford to lose since the last backup. Defining your company’s RPO typically begins with examining how frequently backup takes place. Since backups can be intrusive to running systems, they are typically performed no more often than every few hours. This means that with tape, your RPO is probably measured in hours of data loss.

Recovery Time Objective

The second, the Recovery Time Objective (RTO), is the threshold for how quickly you need an application’s data restored. For example, 4 hours, 8 hours, or the next business day may be tolerable for an e-mail system. Keep in mind the time it takes to provision servers, storage, networking resources and virtual machine configurations. Together, these two measures help you understand your cost of downtime, define a budget for an IT system continuity plan, and choose the technology that meets your needs within that budget.
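Checking a backup plan against these two objectives is a simple comparison. A minimal sketch, using hypothetical figures for a nightly tape backup:

```python
def meets_objectives(backup_interval_hours, restore_time_hours,
                     rpo_hours, rto_hours):
    """Check a backup plan against RPO/RTO targets.

    Worst-case data loss equals the time since the last backup,
    i.e. the full backup interval; worst-case downtime is the
    time needed to restore and reprovision.
    """
    meets_rpo = backup_interval_hours <= rpo_hours
    meets_rto = restore_time_hours <= rto_hours
    return meets_rpo, meets_rto

# Hypothetical plan: nightly tape backup (24-hour interval), 8-hour
# restore, measured against an RPO of 4 hours and an RTO of 8 hours.
print(meets_objectives(24, 8, 4, 8))  # → (False, True)
```

In this example the plan meets the recovery-time target but misses the recovery-point target by a wide margin - the common failure mode for once-a-day tape backup.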

An affordable alternative

Once you’ve estimated your cost of downtime and how much data and time you can afford to lose, you can make an informed decision about which backup strategy is right for you. If tape backup is too expensive or leaves too much risk in your plan, a software-based solution is a better choice.

Good software-based replication and recovery solutions are affordable and easy to maintain. A comprehensive solution continuously copies changes to critical data and applications to a target server, eliminating downtime and data loss. In an outage, you can choose to fail over to your target server manually or automatically - where a complete, up-to-the-minute copy of your data and applications is always waiting. When it’s time to restore, the backup server transfers everything back to the original or a new production server - over standard networks, and regardless of hardware differences.

A comprehensive software package leaves no protection gaps, needs no reminders or manual intervention, and doesn’t require a rocket scientist for installation and setup. Besides being more comprehensive and reliable than tape, software-based backup and recovery is more affordable. Consider this:

Cost benefits of software-based backup and recovery:

  • Reduces tape backup hardware costs by consolidating backups.
  • Reduces tape media costs by reducing backup frequency.
  • Reduces operational costs of performing tape backup.
  • Reduces the cost of downtime from hours or days to minutes or hours.
  • Reduces the cost of data loss to near zero.
  • Reduces fines and penalties for regulated organizations by providing a complete up-to-the-minute record of data and transactions.
  • Reduces your organization’s carbon footprint by allowing you to consolidate sites and servers.

With many excellent solutions on the market, it’s easier and more affordable than ever to make the switch from tape backup to software-based backup. Find a comprehensive solution from a company with a good reputation for customer support and your new backup and recovery plan will pay for itself in a few weeks.


More Stories By Robin Howard

Robin Howard is a Technology Anthropologist & Writer for Vision Solutions.
