Linda Haury to direct DataCore’s strategic marketing efforts
“DataCore is positioned to become a powerhouse in one of the industry’s fastest growing software segments,” said Haury. “Coming out of the desktop and server virtualization space, I can tell you firsthand how essential they are to virtualization projects. Needless to say, I’m eager to commence this exciting role.”
Big Data: The Time is Now for Managing It and Leveraging the Advantages
http://www.dbta.com/Articles/Editorial/Trends-and-Applications/Big-Data-The-Time-is-Now-for-Managing-It-and-Leveraging-the-Advantages-75883.aspx
Big Data Storage
Leveraging the advantages big data offers in terms of global insight and better customer understanding requires smarter data management practices. Consider the storage side of the issue, another area that leaves many organizations and administrators perplexed. In the IOUG survey, a sizable segment of companies with fast-growing data stores reported spending more than one-fourth of their IT budgets on storage.
Data growth is quickly spiraling out of control and driving overspending on data storage. Despite rising IT budgets, the mounting cost of storing more data creates an affordability gap that cannot be ignored. The typical response has been to keep funding expansion and adding hardware. Advances in technology have enabled us to gather more data, faster, than at any other time in history, which has been beneficial in many ways. Unfortunately, keeping pace with that growth forces businesses to provision ever more storage capacity, which drives up costs.
While there has been a relentless push in recent years to store multiplatform data on ever-larger disk arrays, big data demands moving in a different direction. “In contrast to years past, where information was neatly compartmentalized, big data has become widely distributed, scattered across many sites on different generations of storage devices and equipment brands, some up in the clouds, others down in the basement,” George Teixeira, president and CEO of DataCore Software, tells DBTA.
As a result, centralized storage is “both impractical and flawed,” Teixeira says. “It’s impractical because it’s incredibly difficult to lump all that gargantuan data in one neat little bucket. It’s flawed because doing so would expose you to catastrophic single points of failure and disruption.” To address widely distributed data storage, he recommends approaches such as storage virtualization software, backed by mirror images of the data that are kept updated in at least two different locations. “This allows organizations to harness big data in a manner that reduces operational costs, improves efficiency, and non-disruptively swaps hardware in and out as it ages,” he says.
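The mirroring approach Teixeira describes can be pictured with a short sketch. The code below is illustrative only, not DataCore’s implementation; the MirroredVolume class and its methods are hypothetical names. The point is simply that a write is acknowledged only after both copies are committed, so either site can serve reads while the other is repaired or replaced.

```python
# Minimal sketch of synchronous mirroring across two locations.
# Illustrative only; not DataCore's implementation.

class MirroredVolume:
    """Keeps two synchronized copies of every block."""

    def __init__(self):
        self.site_a = {}  # block number -> data at the first site
        self.site_b = {}  # block number -> data at the second site

    def write(self, block, data):
        # Synchronous mirroring: commit to both sites before acknowledging.
        self.site_a[block] = data
        self.site_b[block] = data
        return "ack"

    def read(self, block, prefer="a"):
        # Either copy can satisfy the read, which is what allows hardware
        # at one site to be swapped out without disrupting applications.
        source = self.site_a if prefer == "a" else self.site_b
        return source.get(block)


if __name__ == "__main__":
    vol = MirroredVolume()
    vol.write(0, b"payload")
    assert vol.read(0, prefer="a") == vol.read(0, prefer="b")
```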
CIO Update: The Pros and Cons of SSD in the Enterprise
Storage virtualization software helps shape the shared storage infrastructure required by virtual IT environments and takes good advantage of SSDs where appropriate, reducing the write duty-cycle by caching upstream of the cards to minimize actual writes to media. This effectively extends the useful life of these premium-priced assets.
“It can also make modest-sized SSDs appear to have much larger logical capacity by thin provisioning its resources,” said Augie Gonzalez, director of product marketing at DataCore Software. “The sage advice is to consider SSDs as one element of the physical storage configuration, but put your money on device-independent storage virtualization software to take full advantage of them.”
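Both ideas, caching upstream of the flash and thin provisioning, can be illustrated with a brief sketch. The code below is a rough, hypothetical model rather than SANsymphony’s actual behavior; every name in it is invented for illustration.

```python
# Sketch of two techniques described above, under simplified assumptions:
# (a) repeated writes to the same extent are absorbed in a RAM cache, so
#     far fewer writes reach the flash media (extending SSD life);
# (b) thin provisioning allocates physical extents only on first write,
#     so a small SSD can back a much larger logical volume.

class VirtualSSDVolume:
    def __init__(self, logical_extents, physical_extents):
        self.logical_extents = logical_extents    # capacity promised to the host
        self.physical_free = physical_extents     # real SSD capacity behind it
        self.allocated = set()                    # extents actually backed on flash
        self.cache = {}                           # extent -> latest data not yet on media
        self.media_writes = 0                     # writes that reached the SSD itself

    def write(self, extent, data):
        # Overwrites of the same extent are coalesced in cache, reducing
        # the write duty-cycle on the premium-priced flash.
        self.cache[extent] = data

    def flush(self):
        for extent, data in self.cache.items():
            if extent not in self.allocated:
                if self.physical_free == 0:
                    raise RuntimeError("physical SSD pool exhausted; add capacity")
                self.physical_free -= 1           # thin provisioning: allocate on first write
                self.allocated.add(extent)
            self.media_writes += 1
        self.cache.clear()


if __name__ == "__main__":
    vol = VirtualSSDVolume(logical_extents=1000, physical_extents=100)
    for i in range(500):
        vol.write(extent=7, data=f"v{i}")         # 500 host writes to one hot extent
    vol.flush()
    print(vol.media_writes, vol.physical_free)    # 1 media write; 99 extents still free
```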
TechNet Radio: IT Time – The “Caring and Feeding” for Shared Storage in Clustered Hyper-V Environments
It’s IT Time, and in today’s episode Blain Barton interviews Augie Gonzalez, Director of Product Marketing at DataCore Software. Tune in for this lively conversation as they discuss how DataCore’s SANsymphony-V can help solve potential storage-related issues in Hyper-V clusters.
‘Virtualizing’ disparate storage resources
Would you like to get a point-in-time snapshot of your disparate storage resources regardless of which disk array supplier houses your data? Would you like to create a universal storage pool out of your Fibre Channel, SCSI, EIDE, and IBM SSA drives? Would you like to substantially improve performance yields from equipment already on your floor?
According to DataCore Software, these are just a few of the advantages that network storage pools powered by the company’s SANsymphony are providing – without shutting down, rewiring, or overloading application servers.
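As a rough illustration of the pooling concept, and not of SANsymphony’s actual design, the sketch below carves a volume out of mixed back-end devices without the host knowing which array holds which blocks. All names in it are hypothetical.

```python
# Illustrative sketch of a universal storage pool built from disparate devices.

class StoragePool:
    def __init__(self):
        self.devices = []   # capacity contributed by any vendor or interface type

    def add_device(self, name, free_gb):
        self.devices.append({"name": name, "free_gb": free_gb})

    def create_volume(self, size_gb):
        # Satisfy the request from whichever devices have room, hiding the
        # Fibre Channel / SCSI / EIDE / SSA differences from the host.
        layout, remaining = [], size_gb
        for dev in self.devices:
            take = min(dev["free_gb"], remaining)
            if take:
                dev["free_gb"] -= take
                layout.append((dev["name"], take))
                remaining -= take
        if remaining:
            raise RuntimeError("pool lacks capacity")
        return layout


if __name__ == "__main__":
    pool = StoragePool()
    pool.add_device("fc-array", 500)
    pool.add_device("scsi-jbod", 120)
    print(pool.create_volume(550))  # volume spans both devices transparently
```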
“Currently, anyone who has legacy SCSI disk arrays alongside Fibre Channel and EMC boxes is having to use various vendor-specific utilities to monitor and control their storage resources,” says Augie Gonzalez, director of product marketing for DataCore Software. “With SANsymphony, a central administrator can take overall custody of disparate storage resources in a homogeneous way. No longer do they have to be overwhelmed by each particular disk vendor’s administrative nuances.”
Gonzalez says SANsymphony’s built-in caching algorithms are adding new value to existing storage assets. “Many people have equipment on the floor that doesn’t live up to its performance potential,” he says. “Now you can come in after the fact and accelerate their I/O response time…”
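The effect of such caching can be pictured with a minimal sketch. The ReadCache class below is a generic LRU read cache invented for illustration, not DataCore’s algorithm; it simply shows how repeated reads of hot blocks are answered from memory instead of the slower array.

```python
# Minimal sketch, assuming a simple read cache in server RAM sitting in
# front of an older disk array. Names are illustrative only.

from collections import OrderedDict

class ReadCache:
    def __init__(self, backend_read, capacity=1024):
        self.backend_read = backend_read      # function that reads a block from the array
        self.capacity = capacity
        self.cache = OrderedDict()            # block -> data, kept in LRU order
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)     # mark as most recently used
            return self.cache[block]
        self.misses += 1
        data = self.backend_read(block)       # slow path: go to the disk array
        self.cache[block] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict the least recently used block
        return data


if __name__ == "__main__":
    cache = ReadCache(backend_read=lambda b: f"block-{b}")
    for _ in range(3):
        cache.read(42)
    print(cache.hits, cache.misses)  # 2 hits, 1 miss: most reads never touch the array
```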