Why Visibility is Crucial to Data Archiving

by David Clements – July 12, 2019

If you’ve been following this blog over the last couple of months, you have likely heard of our new DataDiscover Test Drive offering and how it empowers IT departments to scan their toughest NAS systems, completely free of charge for 30 days.

That sounds “cool” to many people, but unless you’re in a role where you’re actively wishing you understood your filers a little better, it may not be obvious how data visibility translates into data archiving, and from there into business value and greater datacenter efficiency. This article walks through the biggest benefits our customers see when they first use DataDiscover to have fact-based conversations about their data: freeing up space on primary NAS, justifying storage spend, and staying in compliance.

DataDiscover gives you an understanding of where your data is, how old it is, and how big it is. That visibility lets you have fact-based conversations about how to reclaim space on your NAS systems. Data archiving is rarely a hot topic: it involves deciding just how unimportant your less-important data is, downsizing a team’s hard-won primary-storage allocation, or working out which regulations apply where. Just as often, though, people avoid conversations about what to tier off of their primary NAS simply because the key facts are missing, which makes discussion difficult.

These facts are:

  • How long ago someone (or some app) used a particular dataset
  • Whether current budget and capacity projections are on track
  • Whether there is a “big, easy” archiving opportunity being ignored, while the team pores over smaller datasets
  • How much primary storage space could be reclaimed if a policy-based archive workflow were set up
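To make the first and last of these facts concrete, here is a minimal sketch of the raw metadata involved: walking a directory tree and tallying how much of its capacity sits in files that haven’t been accessed in over a year. This is only an illustration of the underlying filesystem facts, not DataDiscover’s actual implementation, and the mount point is hypothetical.

```python
import time
from pathlib import Path

def scan_tree(root, cold_after_days=365):
    """Tally total bytes and 'cold' bytes under root.

    A file counts as cold when its last-access time is older than
    cold_after_days. Illustrative only -- a real visibility tool
    gathers far richer metadata than atime and size.
    """
    cutoff = time.time() - cold_after_days * 86400
    total = cold = 0
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        st = path.stat()
        total += st.st_size
        if st.st_atime < cutoff:  # untouched for over a year
            cold += st.st_size
    return total, cold

# Hypothetical NAS mount point, purely for illustration.
total, cold = scan_tree("/mnt/nas/projects")
print(f"{cold / max(total, 1):.0%} of {total} bytes are archive candidates")
```

The cold-bytes figure is exactly the "how much primary storage space could be reclaimed" number that turns an archiving discussion from opinion into fact.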

As long as these facts are missing, any conversation about storage optimization will have little chance of actually impacting the business.

As a storage admin armed with these facts, however, you’ll have a much easier time sitting down with data owners, project managers, and anyone else who puts data into IT-managed storage systems, and working with them to tier data to secondary storage and/or a cloud archive tier. Beyond optimizing what lives on your primary NAS, you’ll be able to make better-informed decisions about which data belongs onsite, and which belongs offsite or in a public cloud.

As a data owner or project manager being presented with these facts, you’ll gain a clearer understanding of where your storage usage is coming from, and might even end up with extra primary storage capacity!

As an end user, you probably won’t see the dashboard during the Test Drive. But your colleagues in IT will be able to give you concrete details about the datasets they’re suggesting you archive. You’ll be able to keep working without worrying about your primary storage filling up, or about your most important datasets being unavailable when you need them.

And as an executive, whether CEO, CIO, or CTO, you’ll have confidence that your data is in the right place, that your IT department is spending their money efficiently, and that your team knows what’s going on with their storage and why.

Finally, knowing which of your data is “old and cold” is essential to staying on the good side of your legal team. Because any data you retain on premises is discoverable in a lawsuit, many enterprises’ data archiving policies also stipulate mandatory deletion dates. For instance, if you’ve committed to permanently deleting any data older than 5 years, you can use an intuitive web UI to find out which of your data meets that threshold, or conversely, to demonstrate that all of your data is already within policy.
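The retention check above boils down to simple date arithmetic on file metadata. Here is a rough stand-in, assuming a hypothetical 5-year policy keyed off last-modified time; real policies usually rely on richer metadata, and the mount point shown is invented for the example.

```python
import time
from pathlib import Path

RETENTION_YEARS = 5  # assumed policy: delete data older than five years

def past_retention(root, years=RETENTION_YEARS):
    """Yield files whose last-modified time exceeds the retention window.

    A command-line stand-in for the kind of age report a visibility
    dashboard provides; illustrative, not a compliance tool.
    """
    cutoff = time.time() - years * 365 * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path

# List deletion candidates under a hypothetical NAS mount.
for path in past_retention("/mnt/nas/archive"):
    print(path)
```

An empty report is just as useful as a full one: it is exactly the evidence that all of your data is already within policy.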

If any of these outcomes would be interesting to you, your team, or your shareholders, don’t hesitate to start your Test Drive now and see how understanding your data is the first step towards better leveraging it, and how this step is easier than ever.

 

Try DataDiscover

 
