Exponential expansion of data needs to be addressed
That may have been a storm in a teacup, but there is now a very real consideration to address: the exponential expansion of data and how enterprises, organisations and people will cope with it. Unlike the Y2K threat, this explosion of data is not a phenomenon we can wait out with bated breath. It is imperative that storage evolves, and that it does so efficiently and, most importantly, securely. Massive data expansion is already under way, and it is a reality that will persist and escalate.
This is particularly true as the Internet of Things (IoT) comes to the fore, in which almost anything can include a sensor, connect to the internet and generate data logs. Gartner recently estimated that the number of connected IoT devices will swell to around 21 billion units within the next four years.
End of an era
What this means is that the era of traditional data storage exists only in our rear-view mirror. Enterprises simply cannot afford to continue down the path of the conventional data storage architectures implemented over the past 20 years, or they will never get ahead of the game. Already, enterprises are running into a brick wall, with business demands and initiatives put on hold because existing and legacy data storage architectures need to be upgraded.
Clearly, change is imperative: both the sources from which we now receive data and the volumes of that data demand that we move from traditional data storage solutions to a data management philosophy. This philosophy requires that the data management tier be agnostic of applications and of media, much as software-defined storage breaks the bond between data and the hardware on which it resides. This brings considerable benefits, including greater operational efficiency and lower costs in SAN and NAS environments alike.
Furthermore, this philosophy provides the flexibility to store data in the cloud (public, private or hybrid), on flash, disk or tape, or on a combination of these.
Capability to stay ahead
Such a philosophy, incorporated into the data management architecture, will ensure the capability to stay ahead of whatever challenges businesses face in future, as we move from petabytes to zettabytes of data that must be stored and managed. The good news is that organisations can begin to take action now by carefully considering and meeting each of the five pillars of a successful data management strategy. The five pillars are:
• Non-disruptive operations - this guards against downtime, which in our information-dependent world has become unacceptable.
• Cloud integration - another imperative, and one that should be standard in any storage and data management architecture.
• Cost efficiencies.
• Performance - which can be addressed by incorporating flash architectures.
• Automation - the final pillar, which needs to be considered across the enterprise, incorporating all the above components seamlessly and irrespective of application or hardware.
Next explosion
One thing organisations cannot afford to do is remain complacent. The Protection of Personal Information (POPI) Act, South Africa's legislation aimed at holding organisations accountable for how they gather and store individuals' data, together with business requirements to collect, store and analyse new data sources from new devices, will drive the next explosion of requirements from a data storage perspective.
Furthermore, while storage optimisation efficiencies have been well identified and executed by most enterprises, these efforts have typically focused on structured data and database optimisation. That will not be sufficient to meet the next five years of business demands, as unstructured data and video must now be included in enterprise data management solutions. However, organisations that take full advantage of the latest storage technologies can ensure their capabilities are up to the task of a data boom the likes of which we have never seen before.