For more than a decade, the cloud has been ascendant. When it first appeared, the cloud was seen as the savior of storage, offering a less expensive alternative to traditional on-premises systems. IT organizations operated under a top-down mandate to go “cloud first,” but soon realized that it was not, in fact, always the most cost-effective infrastructure option.
Since then, a number of mature startups and enterprises have moved their data and infrastructure back on-premises, where they enjoy more control and better economics. Still, the cloud beckons with formidable simplicity, agility, and, yes, cost efficiency in many cases. That is because we are now smarter as an industry about what belongs in the cloud and what does not.
In the past, IT had to choose between moving entirely to the cloud and keeping data on-premises, largely because building a truly hybrid storage system was so difficult. Imagine a file storage system that has been in use for years and holds millions of files, some of which require immediate accessibility and some of which should be archived, yet it is almost impossible to determine which is which.
Today, with intelligent data management tools and flexible software-defined systems that span on-premises and multi-cloud deployments, companies can deploy systems that take advantage of cloud efficiencies while optimizing cost, agility, and access. This is made possible by smart software that understands the profile of each piece of data, can access data anywhere, and can therefore move it automatically based on business rules.
With an intelligent hybrid system, data moves to and from the cloud automatically, as business needs dictate, to optimize cost and performance. The software-defined system becomes the unifier across storage systems and the intelligent layer that controls where data is placed.
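To make the idea of rule-driven placement concrete, here is a minimal sketch of a hypothetical policy engine that classifies files by how long they have been idle and plans moves between an on-premises tier and a cloud tier. The names (FileRecord, PlacementRule, plan_moves) and the 90-day threshold are illustrative assumptions, not DataCore's or SANsymphony's actual interfaces.

```python
# Hypothetical sketch of business-rule-driven data placement between an
# on-premises tier and a cloud tier. Names and thresholds are illustrative.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Iterable, List, Tuple

@dataclass
class FileRecord:
    path: str
    size_bytes: int
    last_access: datetime
    current_tier: str  # "on_prem" or "cloud"

@dataclass
class PlacementRule:
    """A business rule: files idle longer than max_idle belong in target_tier."""
    max_idle: timedelta
    target_tier: str

def decide_tier(record: FileRecord, rules: List[PlacementRule], now: datetime) -> str:
    """Return the tier the rules say this file should live in."""
    idle = now - record.last_access
    # Evaluate the longest idle thresholds first so the coldest rule wins.
    for rule in sorted(rules, key=lambda r: r.max_idle, reverse=True):
        if idle >= rule.max_idle:
            return rule.target_tier
    return "on_prem"  # default: keep recently used data on fast local storage

def plan_moves(records: Iterable[FileRecord],
               rules: List[PlacementRule],
               now: datetime) -> List[Tuple[str, str, str]]:
    """Produce (path, from_tier, to_tier) moves for files in the wrong place."""
    moves = []
    for rec in records:
        target = decide_tier(rec, rules, now)
        if target != rec.current_tier:
            moves.append((rec.path, rec.current_tier, target))
    return moves

if __name__ == "__main__":
    now = datetime.now()
    rules = [PlacementRule(max_idle=timedelta(days=90), target_tier="cloud")]
    catalog = [
        FileRecord("/projects/q3-report.docx", 2_000_000, now - timedelta(days=3), "on_prem"),
        FileRecord("/archive/2019-logs.tar", 50_000_000_000, now - timedelta(days=400), "on_prem"),
    ]
    for path, src, dst in plan_moves(catalog, rules, now):
        print(f"move {path}: {src} -> {dst}")
```

In a real deployment the storage layer itself would maintain the file metadata and perform the actual migration; the sketch is only meant to show that placement decisions reduce to evaluating simple business rules against each file's access profile.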
Request a live demo to learn more about DataCore SANsymphony™ software-defined storage.