
Thoughts from AWS Re:Invent 2016

By Steve Pao on December 5, 2016

Perspectives on hybrid cloud for large, unstructured data that we share with Amazon, plus some differences of opinion informed by our customers.

Last week, a small team of us went to the AWS Re:Invent 2016 conference that featured the latest announcements from Amazon Web Services (AWS).

At Igneous, we are both a consumer and a partner of AWS.

As a consumer, we utilize AWS both in our development pipeline as well as in production to run the Igneous Cloud (our hyperscale remote management platform for our equipment deployed at customers’ premises).

As a partner, we recognize that many of our customers and prospective customers are pursuing a hybrid cloud strategy. In addition to workloads that involve smaller datasets, we’re also hearing from customers who want to utilize public clouds for offsite redundancy of portions of the data they store on-premises, for sharing processed results with parties outside their enterprises, and for bursting compute elastically. (In general, storage for large, unstructured data doesn’t really "burst" — it just grows monotonically!)

At Re:Invent, it was clear from the presentations that the AWS team was also seeing the same trends with large, unstructured data that motivated us to start Igneous.

The picture below is from the "Deep Dive" session on Amazon S3, demonstrating overwhelming interest in utilizing object storage.


Beyond the interest in object storage, there were some good discussions of the trends driving hybrid cloud for large, unstructured data:

Based on these observations, AWS made a number of announcements. As an AWS partner, we’re interested in pursuing AWS Greengrass, as we believe event-driven computing models are right for data-centric computing. And as with AWS Snowball Edge, we believe that storage should also embed compute. All that said, these AWS solutions make the “all-in” assumption that all the data eventually goes to the AWS cloud — even if it is so large that it has to be physically transported via parcel service or via a dedicated truck with AWS Snowmobile.

While these announcements are interesting, and AWS continues to evangelize an "all-in" strategy, even organizations that view themselves as "cloud first" are pursuing hybrid cloud strategies, combining public cloud with their own enterprise data centers.

At Igneous, our aim is to provide options for customers running hybrid clouds without requiring the physical movement of data via truck or parcel service. By managing large, unstructured data in enterprise data centers, enterprises can continuously run their data pipelines without having their data go offline while in transit. By providing True Cloud for Local Data, Igneous Data Service can serve as both an on-ramp and a control point for data that is managed across enterprise data centers and even multiple cloud providers.

For context, here is some media coverage of Igneous during the Re:Invent conference.

If you’re interested in learning more, contact us!


Written by Steve Pao

Steve is CMO at Igneous. Prior to Igneous, Steve was an early executive through two IPOs – as VP of Product Management at Latitude Communications (now part of Cisco) and as SVP and GM of the Security Business at Barracuda Networks.
