This year marked the 20th anniversary of the publication of Sebastian Junger’s “The Perfect Storm,” the tale of a commercial fishing vessel lost at sea after a confluence of problems. Likewise, a combination of conditions has come together to form a perfect storm of storage disruption that existing large-scale file systems will find hard to weather.

Consider: IDC predicts that the amount of data created will reach 40 zettabytes by 2020 (a zettabyte is a billion terabytes), and that we’ll be awash in more than 163 zettabytes by 2025 – 10 times the volume generated in 2016. About 90 percent of this growth will be in file and object storage. This monumental increase will leave companies wondering how they can manage the ever-increasing scale of digital assets.

The data explosion is just getting started. Machine-generated data, virtually all of it file based, is one of the primary contributors to the accelerating growth. Another factor is the trend toward higher-resolution digital assets. Uncompressed 4K video is the new standard in media and entertainment, and the resolution of digital sensors and scientific equipment is constantly increasing. Higher resolution causes file sizes to grow more than linearly: doubling the resolution of a digital photograph, for example, quadruples its size. As the world demands more fidelity from digital assets, its storage requirements grow.

At the same time, huge advances have occurred over the past decade in data analysis and machine learning. These advances have suddenly made data more valuable over time rather than less. Scrambling to adapt to the new landscape of possibilities, businesses are forced into a “better to keep it than miss it later” philosophy – companies have become data hoarders.
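The quadratic scaling mentioned above can be sketched with a quick back-of-the-envelope calculation. The figure of 3 bytes per pixel (24-bit RGB) is an illustrative assumption, not a number from the article; the point is only that uncompressed size grows with the square of linear resolution.

```python
# Sketch: uncompressed image size scales with the *square* of linear
# resolution. Assumes 3 bytes per pixel (24-bit RGB) for illustration.

def uncompressed_size_bytes(width_px, height_px, bytes_per_pixel=3):
    """Raw size of an uncompressed bitmap: one value per pixel per channel."""
    return width_px * height_px * bytes_per_pixel

base = uncompressed_size_bytes(2000, 1500)     # a 3-megapixel photo
doubled = uncompressed_size_bytes(4000, 3000)  # double the linear resolution

print(doubled // base)  # prints 4 – doubling resolution quadruples the size
```

The same square-law growth applies to digital sensors and scan data generally, which is why “slightly better cameras” translate into dramatically larger storage footprints.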