VSM: What are the biggest surprises organizations today are encountering in their server and desktop virtualization projects?
AG: From where I’m standing, lack of attention to storage-related matters generates the most turmoil and angst. You see this especially as customers struggle to migrate their line-of-business applications to virtualized environments. The storage provisioning, maintenance, and tuning practices that they mastered in segregated server configurations largely break down. This collapse ripples out to business continuity procedures and data protection techniques, upsetting even their hardware refresh and purchasing rhythm.
That’s understandable. Versatile virtual machines and virtual desktops lure attention away from the mundane, seemingly rigid realities of physical storage devices. But today we can harness a close relative of server virtualization, storage virtualization software, to arrive at an operationally and economically attractive solution, provided we plan ahead.
VSM: What are the consequences of neglecting storage-related problems?
AG: There’s no turning your back on these issues. Unanticipated storage costs, availability concerns, and performance bottlenecks are the most critical factors bringing server consolidation and desktop virtualization projects to a standstill.
In fact, in DataCore’s recent survey, “The State of Virtualization”, we found that a majority of medium- and large-enterprise IT organizations mistakenly overlook storage when implementing a virtualized operating environment. Nearly half of the respondents (43 percent) either had not anticipated the impact storage would have on their server and desktop virtualization costs or had not started a virtualization project because the storage-related costs “seem too high.”
This has moved the storage issue, with its high cost, inadequate performance, inflexibility, and vendor lock-in, to the front burner in virtualization discussions.
VSM: How does the rise of “Big Data” compound the concern for data center managers? How can they better deal with this issue?
AG: Whereas server and desktop virtualization help consolidate and concentrate storage in common pools, Big Data has the opposite effect. It’s widespread, “a lot here, a lot more there,” with too many places to keep track of. Add to that location problem the different brands and generations of storage devices housing the data, and you can see the predicament.
Some pundits suggest standardizing on specific hardware storage blocks to retain some sanity. But they ignore the insanely fast rate at which hardware becomes obsolete, which is what leads to healthy storage diversity in the first place. Not to mention the outrageous cost of forklift-upgrading everything already in use.
My advice: don’t even try to solve the hardware problem by confining yourself to one piece of gear; that approach doesn’t stand a chance. Instead, steal a play from the desktop virtualization playbook. Accept and encourage equipment diversity while taking measures to manage it uniformly. In the world of disks, you do this by layering a common control plane across central and scattered storage assets using storage virtualization software.
The software lets you take advantage of the many nuances that differentiate one disk array from another, as well as the extra safeguard that separate sites afford you, without burying you in minutiae.
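To make the “common control plane” idea more concrete, here is a minimal, purely illustrative Python sketch. The class and method names are invented for this example and do not reflect DataCore’s actual software; the point is simply how a virtualization layer can pool dissimilar arrays behind one uniform provisioning interface.

```python
# Illustrative only: a toy "control plane" that pools dissimilar storage
# devices behind one uniform provisioning interface. Names are hypothetical.

class BackingDevice:
    """Any array or disk shelf, regardless of brand, generation, or location."""
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.allocated_gb = 0

    def free_gb(self):
        return self.capacity_gb - self.allocated_gb


class VirtualStoragePool:
    """Common control plane: presents one pool, hides device differences."""
    def __init__(self, devices):
        self.devices = devices

    def total_free_gb(self):
        return sum(d.free_gb() for d in self.devices)

    def provision_virtual_disk(self, size_gb):
        # Place the virtual disk on whichever device has the most headroom;
        # the consumer never needs to know which physical box was chosen.
        candidate = max(self.devices, key=lambda d: d.free_gb())
        if candidate.free_gb() < size_gb:
            raise RuntimeError("Pool exhausted")
        candidate.allocated_gb += size_gb
        return {"size_gb": size_gb, "backed_by": candidate.name}


# Usage: mix old and new gear from different vendors and sites in one pool.
pool = VirtualStoragePool([
    BackingDevice("legacy-array-A", 2000),
    BackingDevice("new-array-B", 8000),
    BackingDevice("remote-site-C", 4000),
])
vdisk = pool.provision_virtual_disk(500)
print(vdisk, "free:", pool.total_free_gb(), "GB")
```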
VSM: What additional benefits does device-independent storage virtualization bring to virtualized IT environments, and how does it reward those who approach it properly?
AG: There’s just too much to cover here. Think about how hypervisors liberate you from the constraints of physical server cabinets and you can quickly grasp the beneficial operational and financial impact of storage virtualization. The storage enclosures no longer define you, nor limit your choices or mobility. When one box or one site is out of service, another seamlessly takes over for it. Cost-effective, highly available, fault-tolerant disks that traverse hardware boundaries would be at the top of my list.
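As a rough illustration of the “one box takes over for another” behavior, here is a hedged Python sketch of a mirrored virtual disk that writes to every replica and transparently falls back to a surviving one on read. The classes are hypothetical and greatly simplified; they are not any vendor’s implementation.

```python
# Illustrative sketch of synchronous mirroring with transparent failover.
# Hypothetical classes; not an actual product API.

class StorageNode:
    def __init__(self, name):
        self.name = name
        self.online = True
        self.blocks = {}

    def write(self, block_id, data):
        if not self.online:
            raise IOError(f"{self.name} is offline")
        self.blocks[block_id] = data

    def read(self, block_id):
        if not self.online:
            raise IOError(f"{self.name} is offline")
        return self.blocks[block_id]


class MirroredVirtualDisk:
    """Writes go to every online replica; reads use any replica still up."""
    def __init__(self, replicas):
        self.replicas = replicas

    def write(self, block_id, data):
        for node in self.replicas:
            if node.online:
                node.write(block_id, data)

    def read(self, block_id):
        for node in self.replicas:
            if node.online:
                return node.read(block_id)
        raise IOError("No replica available")


site_a, site_b = StorageNode("site-A"), StorageNode("site-B")
vdisk = MirroredVirtualDisk([site_a, site_b])
vdisk.write("blk-1", b"payload")
site_a.online = False          # take one box or site out of service
print(vdisk.read("blk-1"))     # the surviving replica answers seamlessly
```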
With regard to economics, enhanced capital asset utilization, increased efficiency, and lower operational costs always get the most praise from DataCore customers. The benefits touch nearly every part of their IT infrastructure. As a result, organizations rapidly expand the scope of their storage virtualization program following the initial rollout to encompass conventional physical servers as well.
VSM: Given the urgency, how does DataCore expedite the successful transition to a fully virtualized storage infrastructure?
AG: First and foremost, DataCore leverages storage and server assets already on the data center floor to get you going right away. There is no waiting to “rip and replace” equipment before you get started. It’s like moving right into the house remodeling project without going through the arduous and dusty demolition phase. None of that uneasy feeling that you may be in way over your head.
Instead, the conversion occurs in self-paced stages, with small incremental investments rather than large upfront capital expenditures. I’d say it’s a very pragmatic cutover, matched to your priorities and windows of opportunity. Really, nobody wants to take a big leap, and with DataCore you don’t have to.