Breaking Vendor Lock-In with Enterprise-Scale Backup to Cloud

By Shaun Kehrberg on July 15, 2020

With many enterprise IT organizations now managing petabytes of data from billions of files, it’s not surprising that 70 percent of respondents to our data survey in December 2019 said managing unstructured data is difficult with today’s tools.

How do you manage that much data in a multi-vendor, multi-generational storage environment? For many, the answer has been a patchwork of vendor-specific tools and services to back up each type and tier of storage, adding cost and infrastructure complexity for IT teams.

While the cost and complexity of on-premises backup have continued to rise, new enterprise SaaS backup-to-cloud offerings and the decreasing cost of storage across all three major cloud providers have made enterprise backup to cloud cheaper than most on-prem backup solutions.

That’s why now is the perfect time to weigh alternatives to traditional on-premises backup. Major cost savings can be found by moving enterprise-scale backup datasets away from on-premises targets to cloud targets, with the added benefit of reducing IT complexity by eliminating the patchwork of vendor-specific backup solutions for each storage tier.

Six Keys to Enterprise-Scale Backup to Cloud Success

As the public and private cloud industry matures, more and more companies have been embracing cloud computing to increase their agility and decrease costs. However, before adopting cloud storage for backup workflows, there are six best practices to consider to ensure success. Get these six things right, and you’ll be on your way to a smooth transition to enterprise-scale backup to cloud.

1. Write Backup Data Directly to Archive Tiers

The cost of not pushing data to archive tiers is the difference between roughly $21 per terabyte per month to store data in hot tiers and about $1 per terabyte per month to store it in archive tiers. If you write the data to a hot tier first and move it to an archive tier later, you also incur additional transaction costs, which add up over time. Instead, opt for a backup solution that natively writes to archive tiers, keeping storage costs low and avoiding the transaction costs of moving data between tiers later.
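To make the arithmetic concrete, here's a quick back-of-the-envelope sketch in Python using the round per-terabyte rates above (illustrative figures from this post, not any provider's published pricing):

```python
# Rough monthly storage cost comparison using the round per-TB rates
# cited above (illustrative figures, not a quote from any provider).

HOT_RATE_PER_TB = 21.0      # USD per TB per month, hot tier
ARCHIVE_RATE_PER_TB = 1.0   # USD per TB per month, archive tier

def monthly_cost(capacity_tb: float, rate_per_tb: float) -> float:
    """Monthly storage cost in USD for a given capacity and rate."""
    return capacity_tb * rate_per_tb

capacity_tb = 1000  # a 1 PB backup dataset

hot = monthly_cost(capacity_tb, HOT_RATE_PER_TB)
archive = monthly_cost(capacity_tb, ARCHIVE_RATE_PER_TB)
print(f"Hot tier:     ${hot:,.0f}/month")
print(f"Archive tier: ${archive:,.0f}/month")
print(f"Savings:      ${hot - archive:,.0f}/month ({1 - archive / hot:.0%})")
```

At a petabyte, that works out to roughly $20,000 per month in savings before transaction costs even enter the picture.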

2. Minimize Cloud Transaction Costs

Many vendors promise cloud compatibility but don't optimize for – or even consider – transaction costs. This is an unfortunate mistake when moving NAS data to a cloud tier, because per-request charges (for PUT operations) add up as the number of transactions grows. Your datacenter-to-cloud solution should minimize these costs by design. Make sure you ask your cloud backup vendor to factor transaction costs into the total cost of ownership (TCO) for your solution, so you can verify they optimize around this expense.
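As a rough illustration of why this matters at NAS scale, the sketch below compares issuing one PUT per file against packing small files into larger objects before upload. The per-request rate is an assumed placeholder, so check your provider's current pricing:

```python
# Why per-request (PUT) charges matter at NAS scale: one PUT per file
# versus aggregating small files into larger objects before upload.
# The request rate below is a placeholder assumption, not a quote.

PUT_COST_PER_1000 = 0.05   # assumed USD per 1,000 PUT requests

def put_cost(num_requests: int) -> float:
    """Total request cost in USD for a given number of PUTs."""
    return num_requests / 1000 * PUT_COST_PER_1000

num_files = 1_000_000_000   # one billion small files
avg_file_kb = 64
object_size_mb = 256        # pack files into 256 MB objects

naive_puts = num_files      # one PUT per file
total_mb = num_files * avg_file_kb / 1024
packed_puts = int(total_mb / object_size_mb) + 1

print(f"One PUT per file: {naive_puts:,} requests -> ${put_cost(naive_puts):,.2f}")
print(f"Aggregated:       {packed_puts:,} requests -> ${put_cost(packed_puts):,.2f}")
```

Under these assumptions, a billion per-file PUTs cost tens of thousands of dollars, while aggregation shrinks the same upload to a few hundred thousand requests.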

3. Intelligently Expire Data

Deciding when to expire data often requires careful consideration of legal and financial requirements outside your control. However, cloud providers also impose minimum storage durations dictating how long data must remain in their archive-class storage offerings before it can be deleted without extra charges. For both AWS S3 Glacier Deep Archive and Azure Archive Blob Storage, the minimum is 180 days. Deleting data before meeting these thresholds results in pro-rated early-deletion charges from the cloud provider. Before putting your NAS backups in the cloud, consider how your data retention policy will be enforced and whether you'll be subject to charges for deleting data too soon.
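One way to reason about this is to compute, per object, the later of the provider's minimum-duration date and your own policy expiry. A minimal sketch, assuming the 180-day minimums noted above:

```python
# Earliest date an object can be deleted without an early-deletion
# charge: the later of the provider's minimum storage duration and
# your own business retention policy.
from datetime import date, timedelta

# Minimum storage durations in days (180 for both S3 Glacier Deep
# Archive and Azure Archive Blob Storage, per the discussion above).
MIN_RETENTION_DAYS = {"s3_deep_archive": 180, "azure_archive": 180}

def earliest_safe_delete(written: date, tier: str, policy_days: int) -> date:
    """Return the first day deletion avoids both early-deletion charges
    and a business retention policy violation."""
    provider_floor = written + timedelta(days=MIN_RETENTION_DAYS[tier])
    policy_expiry = written + timedelta(days=policy_days)
    return max(provider_floor, policy_expiry)

print(earliest_safe_delete(date(2020, 7, 15), "s3_deep_archive", 90))
# -> 2021-01-11: the 180-day provider floor overrides the 90-day policy
```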

4. Know When to Clean Up Expired Data

While data expiration involves business policies, reclaiming space in your cloud storage is about optimizing costs. Once you've solved the compliance problem, the data still needs to be deleted from these archive tiers after it expires. To be effective, a storage solution involving archive tiers should apply business logic stipulating when to reclaim capacity (so that capacity doesn't grow unchecked), rather than simply deleting data automatically the moment it expires.
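Here's one possible shape for that business logic: a scheduled cleanup pass that only reclaims space once enough expired data has accumulated, and only for objects past both the policy expiry and the provider's minimum duration. The Obj record and threshold below are hypothetical stand-ins for a real backup catalog:

```python
# Sketch of a policy-driven cleanup pass: expired objects are removed
# in deliberate batches under explicit business logic, never
# automatically the instant they expire. Obj is a hypothetical stand-in
# for a real backup catalog entry.
from dataclasses import dataclass
from datetime import date

@dataclass
class Obj:
    key: str
    size_tb: float
    expires: date         # business retention policy expiry
    provider_floor: date  # earliest delete date under the provider minimum

RECLAIM_THRESHOLD_TB = 50  # skip cleanup cycles with too little payoff

def cleanup_pass(catalog: list[Obj], today: date) -> list[str]:
    """Return keys that are safe and worthwhile to delete today."""
    expired = [o for o in catalog
               if today >= o.expires and today >= o.provider_floor]
    if sum(o.size_tb for o in expired) < RECLAIM_THRESHOLD_TB:
        return []  # not enough reclaimable capacity yet; wait for more
    return [o.key for o in expired]
```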

5. Restore Data Cost-Effectively

The ability to easily restore data provides welcome reassurance that you can get your data back when you need it. Most restore operations initiated by business users involve directories or individual files, so you'll need a solution that can restore just what's requested; recalling significantly more than that won't be efficient. You'll also need the ability to restore data to any storage tier, including both on-premises NAS appliances and file systems in the cloud.
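A simple way to picture this is selecting from a backup manifest only the objects under the requested path, rather than recalling the whole dataset from the archive tier. The manifest format below is purely illustrative:

```python
# Restoring only what was requested: pick the archived objects that
# back one directory (or one file) instead of recalling everything.
# The manifest mapping here is an illustrative assumption.

manifest = {
    "projects/alpha/report.docx": "obj-0001",
    "projects/alpha/data.csv":    "obj-0002",
    "projects/beta/model.bin":    "obj-0003",
}

def select_for_restore(prefix: str) -> list[str]:
    """Object IDs needed to restore one directory or file."""
    return [obj for path, obj in manifest.items() if path.startswith(prefix)]

print(select_for_restore("projects/alpha/"))  # -> ['obj-0001', 'obj-0002']
```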

6. Move Data Quickly and Securely to the Cloud

Keep in mind that moving data quickly to the cloud requires a direct connection to your preferred cloud storage system. All three major cloud vendors – Amazon, Google and Microsoft – offer a direct connection to their cloud. Typically billed per hour, these direct connections are usually available in 1G, 10G, or sometimes 100G options.
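For a rough sense of what those line rates mean in practice, here's a back-of-the-envelope calculation of transfer time for a 500 TB backup dataset, assuming full, sustained link utilization (real-world throughput will be lower):

```python
# Transfer time for a backup dataset over direct links at the line
# rates mentioned above; assumes perfect, sustained utilization.

DATASET_TB = 500  # illustrative dataset size

for gbps in (1, 10, 100):
    bits = DATASET_TB * 8e12          # 1 TB = 10^12 bytes = 8x10^12 bits
    seconds = bits / (gbps * 1e9)
    print(f"{gbps:>3} Gbps: {seconds / 86400:,.1f} days")
```

At 1 Gbps that dataset takes over six weeks to move; at 10 Gbps, under five days. Since direct connections are billed by the hour, a faster link that finishes sooner can also be the cheaper one.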

For a deeper dive into the six best practices to minimize costs and optimize performance for enterprise-scale backup to cloud workflows, download our complimentary 20-page eBook, "Enterprise Backup To Cloud: A Playbook For Cost-Effective Implementation."
