Why Tightly-Coupled Containers and Microservices Are a Must in Achieving Cloud-Native Storage

There has been a significant rise in the adoption of containers and Kubernetes for cloud-based development and service deployment. A study by the CNCF (Cloud Native Computing Foundation) found that over 70% of respondents ran containerized applications in production, while the rest said they planned to do so in the future. The rise in popularity of containers is driven by the significant benefits containerization offers, which include efficient resource utilization, infrastructure elasticity, improved scalability and productivity, easier deployment and configuration, continuous integration, and application portability.

Meanwhile, microservices architecture has evolved

With the rapid adoption of containerization, application architecture for building modular applications has also evolved, from the service-oriented architecture introduced a couple of decades ago to microservices architecture. Applications with a microservices architecture are built as inter-communicating suites of services that are independently deployable and scalable. An application based on microservices architecture is designed with a number of key things in mind: decentralized governance and data management, infrastructure automation, extensibility, and resilience to failure.
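
To make the pattern concrete, here is a minimal sketch in Go (the "catalog" service, its API, and the port are hypothetical, and both sides run in one process purely for brevity): each service owns its own data and is consumed only through a small, well-defined network API, so it can be deployed and scaled on its own.

```go
// A minimal sketch of two cooperating microservices. Both run in one process
// here purely for brevity; in practice each would be its own container behind
// its own service endpoint.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"time"
)

// catalogHandler stands in for a hypothetical "catalog" microservice: it owns
// its own data (decentralized data management) and exposes a small,
// well-defined API.
func catalogHandler(w http.ResponseWriter, r *http.Request) {
	item := map[string]string{"id": r.URL.Query().Get("id"), "name": "demo-item"}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(item)
}

func main() {
	// The catalog service is independently deployable and scalable.
	go func() {
		log.Fatal(http.ListenAndServe(":8081", http.HandlerFunc(catalogHandler)))
	}()
	time.Sleep(200 * time.Millisecond) // give the listener a moment to start

	// A second service (reduced here to a single client call) consumes the
	// catalog only through its network API, never through shared state.
	resp, err := http.Get("http://localhost:8081/?id=42")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	var item map[string]string
	if err := json.NewDecoder(resp.Body).Decode(&item); err != nil {
		log.Fatal(err)
	}
	fmt.Println("consuming service received:", item)
}
```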

The connection between containers and microservices architecture

The two are clearly linked, and not by chance: containerization provides distinct, natural boundaries between microservices. However, applications can be containerized even if they are monolithic, making the container the monolith. This approach has a downside: because the change cycles of an application’s components are linked, any change to a single component requires the entire application to be updated. Doing this once or twice in an application’s lifetime may be manageable, but in practice it becomes difficult to maintain a good modular structure over time, and scaling becomes inefficient, if not downright impossible.

When it comes down to it, enterprises and developers have realized that they cannot enjoy the full benefits of containerization without adopting a microservices architecture for application modernization.

Managing persistent data & storage

Initially, the adoption of containers was driven mostly by stateless applications: containerized microservices operating as the front end, backed by a stateful, non-containerized back end. Moving to a fully container-based infrastructure requires both stateless and stateful applications to run as containers, which raises data and storage management challenges in these environments that must be overcome.
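
As a rough illustration of why stateful containers change the storage picture, here is a minimal sketch of a service whose only state lives on a mounted path (the /data path and file name are assumptions for the example). Unless that path is backed by storage that outlives the container, the state disappears every time the container is rescheduled.

```go
// A minimal sketch of why a stateful container needs storage that outlives it.
// The /data mount path and file name are assumptions for the example; in a
// container platform, /data would be backed by a persistent volume.
package main

import (
	"fmt"
	"log"
	"os"
	"strconv"
	"strings"
)

const counterFile = "/data/visits.txt"

func main() {
	// Recover previous state. A freshly scheduled container with no
	// persistent volume behind /data always starts from zero.
	count := 0
	if raw, err := os.ReadFile(counterFile); err == nil {
		count, _ = strconv.Atoi(strings.TrimSpace(string(raw)))
	}
	count++

	// Persist the new state. Without durable storage mounted at /data,
	// this write is lost as soon as the container is rescheduled.
	if err := os.WriteFile(counterFile, []byte(strconv.Itoa(count)), 0o644); err != nil {
		log.Fatal(err)
	}
	fmt.Println("visits so far:", count)
}
```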

Today we are in a state of transition as to how the data associated with stateful applications in containerized production deployments is stored and managed. Many rely on siloed legacy enterprise storage solutions that are external to, and not an integral part of, the container environment. Despite this limitation, these solutions are mature and provide extensive data management capabilities, including tiering, erasure coding and data reduction for storage efficiency, and disaster recovery, to name a few. To truly simplify management, reduce costs, improve resource utilization, and benefit from what containerization has to offer, the storage environment must be an integral part of the container environment and must coexist side by side with containerized applications.

Making this a reality requires a new approach.

A new approach for storage architecture

When it comes to storage architecture, the holy grail has always been a clear separation between the control and data planes. This allows independent scaling of metadata (control plane) and data (data plane) flows, as well as of data management operations such as data mobility, tiering, and snapshots. To date, storage implementations haven’t achieved effective separation of the two planes; numerous things stand in the way, including cumbersome standards, bolt-on incremental features, and suboptimal data flows.
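
As a rough sketch of what that separation looks like, consider the following Go interfaces (the interfaces and the toy in-memory control plane are illustrative assumptions, not any particular product’s API): the control plane deals only in metadata and placement, while the data plane only moves bytes, so each can be scaled on its own.

```go
// A rough sketch of control/data plane separation. The interfaces and the toy
// in-memory control plane below are illustrative assumptions, not any
// particular product's API.
package main

import "fmt"

// ControlPlane handles metadata-only operations: creating volumes, taking
// snapshots, and resolving where data actually lives.
type ControlPlane interface {
	CreateVolume(name string) error
	Snapshot(volume, snap string) error               // metadata-only: no data is copied
	Locate(volume string, offset int64) (node string) // which data-plane node holds this extent
}

// DataPlane only moves bytes; it never makes placement decisions. Its
// implementation (and scaling) is entirely independent of the control plane.
type DataPlane interface {
	Write(volume string, offset int64, payload []byte) error
	Read(volume string, offset int64, length int) ([]byte, error)
}

// inMemoryControl is a toy control plane mapping volumes to data-plane nodes.
type inMemoryControl struct{ placement map[string]string }

func (c *inMemoryControl) CreateVolume(name string) error {
	c.placement[name] = "data-node-0" // trivial placement policy for the sketch
	return nil
}

func (c *inMemoryControl) Snapshot(volume, snap string) error {
	c.placement[snap] = c.placement[volume] // a snapshot is new metadata pointing at the same data
	return nil
}

func (c *inMemoryControl) Locate(volume string, offset int64) string {
	return c.placement[volume]
}

func main() {
	var ctrl ControlPlane = &inMemoryControl{placement: map[string]string{}}
	ctrl.CreateVolume("vol1")
	ctrl.Snapshot("vol1", "vol1-snap")
	// A client asks the control plane where data lives, then talks to the
	// data plane directly, keeping metadata traffic out of the data path.
	fmt.Println("vol1 extents live on:", ctrl.Locate("vol1", 0))
	fmt.Println("snapshot resolves to:", ctrl.Locate("vol1-snap", 0))
}
```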

The shift to containers and microservices brings with it the opportunity to take a fresh approach in the storage world.

Introducing Microservices to Storage

The principles of microservices architecture apply naturally to container-native storage system design. With the control and data planes separated, a microservices-based, container-native solution consists of distinct control services and data services, each delivering its function in a distributed, scalable way. When a storage system is implemented using microservices, it doesn’t just enable the separation of the two planes – it actually forces it.

Here’s what happens when using a microservices-based container-native storage system that separates the two planes effectively:

  • Performance & capacity scaling:
    Scaling is provided along several axes, including capacity, bandwidth, and IOPS. Capacity and performance can be scaled up or down as needed, so resources are scaled and shared efficiently across applications.
  • Resilience:
    Individual microservices can fail and be restarted independently. Combined with container orchestration, which detects failures and reschedules services, overall resilience improves.
  • Data management:
    A metadata microservice can carry out many data management operations on its own without affecting the data plane. When data does require manipulation, decoupling the metadata and data operations minimizes the performance impact and increases efficiency.
  • Storage media support:
    Because microservices are independent and communicate over a well-defined protocol, a microservices/container system can implement multiple flavors of the data-plane microservice, each suited to a different storage medium.
  • Tiering:
    Tiering operations between media types can be controlled to optimize both cost and data layout.
  • Data mobility:
    Objects (files, volumes, etc.) can be virtualized as metadata-only objects that refer to a common data pool spanning many media types and geographies. This enables new data mobility capabilities in hybrid and multi-cloud deployments (see the sketch after this list).
  • Storage protocol & application support:
    The front end presented to applications is itself a microservice, so it too can be implemented in a number of flavors. This provides greater flexibility, supporting varying storage-access protocols as well as application-specific access.

While the benefits of a microservices-based container-native storage system include portability, flexibility, and scalability, it’s important to also be aware of the issues that may arise. One example is maintaining strong consistency, which can be very difficult for distributed systems that need to deliver performance. Of course, this and other challenges are difficult but not impossible to solve, and shouldn’t stand in the way of pursuing a microservices-based architecture. The benefits are simply too significant to ignore.

Conclusion

Containers and microservices architecture have evolved to provide significant value to businesses as more applications are implemented as cloud-native apps. Improved flexibility and extensibility help create the solutions that today’s applications require, without the typical limitations of infrastructure. Embracing a microservices approach is needed to remove the limitations of existing storage solutions and realize the potential containers offer for application modernization.
