Preparing for the Next Wave of Data: Storage Experts Weigh In

Posted NOVEMBER 03, 2016. Filed under Development, Releases.

In today’s “Third Age of Data,” scalability and real-time analytics have become more essential than ever for companies that want to meet business demands and stay ahead of the curve. While the growing volume of data generated by an estimated 1 trillion sensors across networked resources offers enterprises a wealth of opportunities to discover new insights, it also brings unprecedented challenges. As a result, businesses are seeking ways to do more with less.

This doesn’t mean that they should simply use less hardware, but rather that they will be best positioned when their technical environments are intentionally designed to support their specific workflows. Because data is being created at such a rapid pace, enterprises need to be strategic when designing those environments.

To help you define the intent and strategy behind your enterprise storage environment, we asked a number of storage industry experts to weigh in on the following question:

“What is the one thing enterprises should do in order to prepare for the next wave of data?”

 

Doug Black
Twitter: @EnterpriseTek
EnterpriseTech


With apologies for violating the express, written instructions for answering this question (“the one thing enterprises should do…”) – the right approach for handling hyper-scale growth in machine data is to cover many fronts simultaneously. It’s a broad, multifaceted problem. Organizations must deal with many variables based on what functional area a particular enterprise is trying to support and how the data must be used and managed. Hybrid storage strategies are likely a necessity for many (much as they are now), and storage isn’t the only issue in dealing with the data flood. Data security, availability, mobility, and I/O requirements are all factors besides scalability. Object storage is likely to grow in importance (no file system). Data reduction is also likely to be a huge need (selective retention and compression) – it’s imperative that organizations have strategies in place for getting rid of data that can be discarded after analytics is completed. Prioritizing one (or two) of these challenges at the expense of the others is not a formula for IoT machine data management success.


Tom Coughlin
Twitter: @thomascoughlin
Digital Storage Technology Newsletter; Storage Bytes, Forbes


Machine-generated data from the increasing electronic intelligence built into industrial infrastructure as well as consumer devices will put significant pressure on cloud and edge computing resources, including processing, network bandwidth, and storage capacity. Finding, managing, and processing this data to meet individual, corporate, and social needs will dominate the information technology industry for at least the next 10 years.


Greg Schulz
Twitter: @StorageIO
StorageIO


Organizations need to put on their data life jackets and prepare their data infrastructure boats, because streams of data flowing into that river of little and big data are about to cause data lakes and ponds to flood, overflowing their banks on the way to the gulf of data and the sea of big and little data. Seriously, now is the time to prepare: get your data infrastructure (hardware, software, services, processes, practices) in place, clean it up, remove overhead and complexity, and find and fix problems so you can support cost-effective scaling. Leverage new and old things in new ways, including data footprint reduction (DFR) upstream at the source as well as downstream. Otherwise, not even an object-enabled data ark with RAIN resiliency will keep you afloat when the data clouds burst and the subsequent storms arrive.


Randy Kerns
Twitter: @rgkerns
Evaluator Group


Enterprise IT must understand the magnitude of the challenge posed by the influx of non-traditional data. Not only must they plan for massive capacity demands, but they must also recognize that traditional data protection processes are not practical and that the data may have a very long lifespan, as new insights may be gained with new tools and applications. Enterprise IT needs to develop a strategic plan for dealing with machine data.


Arun Taneja
Twitter: @ArunTanejaGroup
Taneja Group


Very simply put, enterprises need to evaluate and settle on an architecture that is microscopically designed to deal with this mass of data in all dimensions (volume, variety, and velocity). Traditional architectures will have a hard time dealing with one or more of these dimensions. That means customers will build silos again and that is a death knell for productivity, especially in IoT-type environments. These new architectures need to be infinitely scalable, have the ability to deal with files of all sizes equally effectively and allow access to the data via a variety of protocols and APIs. In my view, evolving traditional architectures, while possible, is a much harder route to achieving all these goals. The problems we are trying to solve today were unrecognizable when most traditional architectures in the market were conceived. A fresh start is therefore warranted.


Dave Raffo
Twitter: @DRaffoStorage
TechTarget SearchStorage


Enterprises need to be most concerned with new types of data that they haven’t seen much of until recently. Yes, traditional data – especially file data – is still growing, and traditional storage systems do a good job of handling it. But machine-generated data and data used for DevOps are also growing. These new data types often require different methods and tools for storing and managing them, and those new tools must be designed for people beyond traditional storage admins.


Enrico Signoretti
Twitter: @ESignoretti
Juku.it


The real problem is that we are only at the beginning. Talking to IT organizations and the teams that are developing next-generation connected products, it is clear that sensor data collection, analytics, and archiving are major issues. Some of them are experiencing exponential data growth; for others the growth is still limited, but there are other aspects to consider as well, especially when it comes to the applications that access that data. In fact, it’s no longer just a problem of storing data safely: depending on the type of applications you are developing on top of it, you need a flexible infrastructure that can cope with disparate and unforeseen requirements. Scalability, performance, flexibility, ease of use, and efficiency are the key aspects to watch in a modern storage system that has to deal with unpredictable data growth while adapting rapidly to any kind of application request.


Stephen Foskett
Twitter: @SFoskett
Gestalt IT


Enterprises should move from seeing storage as a point solution per application to treating it as an enterprise-wide resource. Prepare standard IT offerings that are flexible, scalable, and integrated with the rest of the IT stack. This will allow enterprises to absorb the vast amount of data coming from IoT, instrumentation and analytics, and the increased use of video and sensor data.


Henry Baltazar
Twitter: @StorageZar
451 Research


No “one thing” will allow you and your company to survive and thrive with the next wave of data that is emerging. The key element will be flexibility: not only looking beyond traditional appliances and infrastructures, but aggressively embracing new technology innovations in hardware, software, and cloud services, even if that requires reorganization and retraining to get current infrastructure administrators up to speed.


Based on what we heard from the experts above, a storage strategy is no longer just a technical challenge but a strategic one. More important than meeting specific capacity or performance requirements is aligning your storage with a holistic technical strategy. This means having a storage solution that is future-proofed for rapid innovations in hardware, software, and cloud capabilities and that can support new methods of data analysis. Taking a software-focused approach to storage is one thing enterprises can do to prepare for the next wave of data. By doing so, your storage becomes invisible and your data becomes visible at scale.

Learn more about how customers are overcoming today’s massive data storage challenges by reading the Qumulo Field Report by the Taneja Group, where they spotlight six Qumulo customers.
