According to research from Gartner, more than 50% of companies will use container technology by 2020. This comes as no surprise: containers bring unprecedented mobility, ease, and efficiency to rapidly deploying and updating new microservices and existing applications, and they will play an increasingly large role in IT in the years to come.
In fact, according to recent research we conducted, 46% of respondents stated that they had containers deployed in either production or development/testing. That said, as container technology matures, it needs to solve new challenges in order to be deployed broadly, especially for business-critical applications. The two main areas where container technology needs to mature are security and persistent storage.
These challenges are evidenced in DataCore’s recent State of Software-Defined, Hyperconverged and Cloud Storage report, which found that users have encountered the following surprises and unforeseen issues after implementing containers:
- Lack of data management and storage tools
- Application performance slowdowns – especially for databases and other tier-1 applications
- Lack of ways to deal with applications such as databases that require persistent storage
To overcome these challenges, IT organizations must be able to meet the same data storage requirements for applications that are migrated to or created under a container architecture. Meeting these requirements calls for the same data services currently provided to traditional application architectures. Storage services will only grow in importance as container deployments move from evaluation and testing into production.
Above all, to facilitate migration and accelerate adoption, a storage solution needs to be capable of providing shared storage to both existing (virtualized and bare-metal) application infrastructures and container-native applications, and of doing so with a consistent set of data services. In other words, a modern storage solution must provide DevOps teams with persistent, stateful application data, allow storage to be consumed on demand, and deliver the same level of availability and performance currently provided to traditional application infrastructures.
Software-defined storage (SDS) is an optimal solution for providing a consistent set of data services and shared data access across different types of applications while leveraging different types of storage systems.
A modern SDS platform should give administrators the ability to present persistent storage to container hosts, VMs on virtual hosts, and bare-metal servers. Persistent storage should be presented through the native controls of orchestration solutions such as Kubernetes, and should leverage advanced storage capabilities like auto-tiering, synchronous mirroring, and continuous data protection (CDP).
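To make the Kubernetes point concrete, the sketch below shows how an application team might request persistent storage on demand through Kubernetes' native controls, using the official Kubernetes Python client. It assumes the SDS platform exposes a StorageClass; the class name "sds-tier1", the claim name, and the 20Gi size are hypothetical placeholders, not details of any particular product.

```python
# Minimal sketch: request persistent storage via a PersistentVolumeClaim.
# Assumes the official "kubernetes" Python client and a StorageClass named
# "sds-tier1" backed by the SDS platform (the name is hypothetical).
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running in a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="sds-tier1",      # hypothetical SDS-backed class
        resources=client.V1ResourceRequirements(
            requests={"storage": "20Gi"}     # capacity requested on demand
        ),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

Pods then mount the claim by name; the orchestrator and the storage layer decide where the volume actually lives and how capabilities like mirroring or tiering are applied to it.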
By leveraging the power of software-defined storage, users can manage the provisioning of storage to container deployments with the same platform as their other application workloads, while providing the level of enterprise storage services required for all critical production environments. In addition, SDS allows instant cloning, which simplifies and speeds up development and test processes.
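One generic way orchestrators expose this kind of cloning is CSI volume cloning in Kubernetes, where a new claim names an existing claim as its data source. The sketch below continues the hypothetical example above (same StorageClass and claim names); whether the clone is effectively instant depends on the underlying storage driver.

```python
# Sketch: clone an existing claim for a dev/test copy (CSI volume cloning).
# The source claim "app-data" and clone name "app-data-dev" are hypothetical;
# source and clone must live in the same namespace.
from kubernetes import client, config

config.load_kube_config()

clone = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data-dev"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="sds-tier1",
        data_source=client.V1TypedLocalObjectReference(
            kind="PersistentVolumeClaim", name="app-data"
        ),
        resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=clone
)
```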
As SDS becomes better understood in the world of containers and Kubernetes, it will surely accelerate their adoption for real-world applications.
DataCore recently released two plugins that enable containerized applications and databases to access persistent storage volumes through its software-defined storage services. Both provide numerous options for advanced data protection and data recovery for dependable, predictable behavior at scale. Read the press release to learn more, or talk with a DataCore expert to understand how to dynamically provision highly available, persistent shared storage to containerized applications with a software-defined infrastructure.