
Data Growth Is Unpredictable — but Here's How You Can Plan for It

By Catherine Chiang on January 30, 2018

According to Gartner, 70% of organizations list managing data growth as their top concern. With data growing exponentially while legacy storage systems struggle to handle petabyte-scale data, it’s no easy feat for enterprise IT to balance modern data management needs with budgets that aren’t growing as quickly as their data.

Even if your enterprise has a long-term plan for managing data growth, changes in the way we generate and use data are already producing unpredictable, explosive growth that is harder and harder to manage.

Data Growth is Unpredictable

Moore’s Law, named after Intel cofounder Gordon Moore, observed that the number of transistors on a chip doubles roughly every two years. However, the industry is reaching the limits of Moore’s Law while 63% of enterprises report annual data growth rates of 20% or higher—meaning that data is still growing, but storage density isn’t keeping pace.

In addition, enterprises need to account not only for how their data is growing today, but also for how data generation and use are changing. Advances in machine learning and artificial intelligence are likely to improve the way businesses use data, but they will also create storage and management problems for businesses that may not yet struggle with large amounts of unstructured data.

For example, unpredictable changes in how data is generated in the life sciences industry resulted in an explosion of genomic data. As technological developments made sequencing equipment more affordable and widespread, labs began generating genomic data faster than IT could add capacity.

The way we generate and use data is changing across all industries, making it imperative for enterprises to plan for the unpredictable when it comes to data growth.

Scale-Out, Not Up

So how do you plan for data growth that's both unpredictable and potentially explosive?

One way to future-proof your storage and data management systems against unpredictable data growth is to incorporate scale-out architecture into your infrastructure. Traditional vertical scaling (“scale-up”) increases capacity by adding more devices to an existing system; scale-out, or horizontal scaling, increases capacity by connecting multiple nodes that work as a single system.

While the industry’s emphasis has been on scale-out primary storage, scale-out secondary storage systems also have many advantages over scale-up systems for large unstructured datasets.

With traditional scale-up storage, once the system reaches its limit, the only way to add capacity and performance is to add a new, separately managed system. This results in silos that add unnecessary complexity to your infrastructure.

Scaling out means that as storage capacity increases, so does performance. Since all the data can stay in one system as it scales, enterprises can eliminate their backup silos and streamline their secondary storage even as data grows.
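To make the single-namespace idea concrete, scale-out systems commonly place data with techniques like consistent hashing, so that adding a node grows capacity while remapping only a fraction of existing objects. The sketch below is a generic illustration of that technique, not a description of Igneous’s actual implementation; the node names and object keys are made up for the example.

```python
import hashlib
from bisect import bisect_right

class HashRing:
    """Minimal consistent-hash ring: each node owns slices of the ring,
    and adding a node moves only the keys that fall into its new slices."""

    def __init__(self, nodes=(), vnodes=64):
        self.vnodes = vnodes          # virtual nodes per physical node
        self.ring = []                # sorted list of (hash, node) points
        for n in nodes:
            self.add_node(n)

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add_node(self, node):
        # Scale out: insert this node's virtual points into the ring.
        for i in range(self.vnodes):
            self.ring.append((self._hash(f"{node}#{i}"), node))
        self.ring.sort()

    def node_for(self, key):
        # An object lands on the first ring point at or after its hash.
        h = self._hash(key)
        idx = bisect_right(self.ring, (h, chr(0x10FFFF)))
        return self.ring[idx % len(self.ring)][1]

keys = [f"object-{i}" for i in range(1000)]
ring = HashRing(["node-a", "node-b", "node-c"])
before = {k: ring.node_for(k) for k in keys}

ring.add_node("node-d")  # one more node, same single namespace
after = {k: ring.node_for(k) for k in keys}

moved = sum(before[k] != after[k] for k in keys)
print(f"{moved} of {len(keys)} objects remapped")  # only a fraction move
```

Because placement is deterministic across the whole cluster, the system behaves as one namespace: capacity and aggregate throughput grow with each node, and only the data assigned to the new node is rebalanced, rather than everything being migrated to a separate silo.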

If you’d like to learn more about how we’ve built Igneous Hybrid Storage Cloud to handle unstructured data at scale, check out our technical whitepaper, “Igneous Scale-Out Architecture.”
