The drop in the cost of SSDs (solid state drives) and flash memory has already had a sizable impact on IT organizations, and this will continue in 2012. The latest storage systems incorporate these innovations and tout high performance and high availability, but they remain beyond an acceptable price point for most companies. Price is not the only concern: the useful lifetime of SSDs and flash memory is limited by the amount of write traffic they absorb, so these devices need to be monitored and protected to avoid potential data loss.
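As a rough sketch of what that monitoring can look like, the snippet below polls a drive's SMART data with smartctl and flags the device when a wear indicator runs low. The device path, the attribute names, and the alert threshold are assumptions for illustration only; wear-reporting attributes differ from one SSD vendor to another.

```python
import subprocess

# Assumed wear attribute names; the exact names reported vary by SSD vendor.
WEAR_ATTRIBUTES = ("Media_Wearout_Indicator", "Wear_Leveling_Count", "Percent_Lifetime_Remain")

def remaining_life(device="/dev/sda"):
    """Return the lowest normalized value (0-100) among known wear attributes, or None."""
    out = subprocess.run(["smartctl", "-A", device], capture_output=True, text=True).stdout
    values = []
    for line in out.splitlines():
        fields = line.split()
        # smartctl -A rows look like: ID# ATTRIBUTE_NAME FLAG VALUE WORST THRESH ...
        if len(fields) >= 4 and fields[1] in WEAR_ATTRIBUTES and fields[3].isdigit():
            values.append(int(fields[3]))
    return min(values) if values else None

if __name__ == "__main__":
    life = remaining_life()
    if life is not None and life < 10:  # hypothetical alert threshold
        print(f"WARNING: SSD wear indicator at {life}; reduce write traffic or plan replacement")
```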
Even so, the big driver for SSDs is the need for greater performance, yet performance needs do not apply equally to all data. In fact, the majority of data, roughly 90% on average, can reside on low-cost archives or mid-tier storage. Meanwhile, the major storage vendors continue to implore us to throw new hardware and more SSDs at the problem, because higher-priced systems are what they want to sell. These innovations are great, but they must be applied wisely. To get the most value out of these expensive devices, software that protects them, optimizes their utilization, and minimizes write traffic is now a “must-have.”
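One minimal sketch of what “minimize write traffic” can mean in practice: coalesce repeated writes to the same block in RAM so that only the latest version ever reaches flash. The block granularity, flush threshold, and backend callable here are hypothetical; real storage virtualization layers do this with far more sophistication and with protection against power loss.

```python
class WriteCoalescingCache:
    """Absorb repeated overwrites in RAM; flush each dirty block to the SSD only once."""

    def __init__(self, backend_write, max_dirty_blocks=1024):
        self.backend_write = backend_write      # callable(block_id, data) -> None
        self.max_dirty_blocks = max_dirty_blocks
        self.dirty = {}                         # block_id -> latest data
        self.writes_seen = 0
        self.writes_issued = 0

    def write(self, block_id, data):
        self.writes_seen += 1
        self.dirty[block_id] = data             # later writes simply replace earlier ones
        if len(self.dirty) >= self.max_dirty_blocks:
            self.flush()

    def flush(self):
        for block_id, data in self.dirty.items():
            self.backend_write(block_id, data)
            self.writes_issued += 1
        self.dirty.clear()


# Example: 10,000 overwrites of 16 hot blocks reach the SSD as only 16 physical writes.
cache = WriteCoalescingCache(backend_write=lambda block_id, data: None)
for i in range(10_000):
    cache.write(i % 16, b"payload")
cache.flush()
print(cache.writes_seen, "logical writes ->", cache.writes_issued, "SSD writes")
```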
In 2012, businesses will gain a better understanding of why software is needed to expand the range of practical use cases for SSDs and to automate when, where and how best to deploy and manage these devices. Auto-tiering and storage virtualization software is critical to cost-effectively getting full utilization out of flash memory and SSD-based technologies, treating them as one important element within the larger spectrum of storage devices that must be fully integrated and managed within today’s dynamic storage infrastructures.
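To make the auto-tiering idea concrete, here is a stripped-down sketch of the kind of placement policy such software automates: rank storage extents by recent access frequency, keep only the hottest slice (the roughly 10% noted above) on flash, and demote the rest to cheaper tiers on each pass. The tier names, cut-off fractions, and migrate() stub are assumptions for illustration, not any vendor's actual algorithm.

```python
from collections import Counter

TIERS = ("ssd", "midrange", "archive")   # hypothetical tier names
HOT_FRACTION = 0.10                      # keep roughly the hottest 10% of extents on flash

def plan_tiers(access_counts: Counter, archive_fraction=0.50):
    """Map each extent to a tier based on its recent access frequency."""
    ranked = [extent for extent, _ in access_counts.most_common()]
    hot_cut = max(1, int(len(ranked) * HOT_FRACTION))
    cold_cut = int(len(ranked) * (1 - archive_fraction))
    plan = {}
    for rank, extent in enumerate(ranked):
        if rank < hot_cut:
            plan[extent] = "ssd"
        elif rank < cold_cut:
            plan[extent] = "midrange"
        else:
            plan[extent] = "archive"
    return plan

def migrate(extent, tier):
    print(f"migrate extent {extent} -> {tier}")   # stub: real software moves the data

# Periodic re-evaluation against a skewed workload; only extents whose tier changed are moved.
current = {}
counts = Counter({f"ext{i}": max(1, 1000 // (i + 1)) for i in range(50)})
for extent, tier in plan_tiers(counts).items():
    if current.get(extent) != tier:
        migrate(extent, tier)
        current[extent] = tier
```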