Containers hold the key for DBAs who want to gain database independence.
Disruption, driven by a new generation of technologies and web-based services, is now the only constant in business. The rise of Uber and Airbnb, along with the threat of Amazon disrupting traditional industries, is forcing organisations to revamp their IT strategies. Today, businesses are demanding IT infrastructures that are agile and dynamic, as this is the only way to react in a timely manner to ever-changing customer demands.
The Cloud was seen as the technological breakthrough that would enable your organisation to build a responsive IT infrastructure, but there is a challenge in going down this route. The business model of Cloud Service Providers relies on keeping customers on their platforms, which undermines one of the purported benefits of Cloud, namely data portability. That does not chime with your requirement for agility and greater freedom to scale and migrate your IT environment. Portability enables you to move workloads between on-premise and Cloud environments, as well as between service providers, to secure the best deals and dynamically manage workloads. At a time when budget constraints and agility compete for dominance, such portability is critical.
Thankfully, another technology has come along to provide the flexibility you crave. Containers are the “new, new thing”, with various studies suggesting their adoption is on the rise. A recent Datamation survey found that 31 per cent of respondents are using containers as part of their Cloud deployments, while another 28 per cent are actively considering them. As Datamation puts it, “This indicates that container technologies have matured and will gain large-scale enterprise adoption in the next few years.”
I would go as far as to say that containers are key to your IT independence, because they enable you to minimise your reliance on individual service providers while increasing agility and data portability. They should become a central pillar of your IT strategy as you adapt to the threats and opportunities presented by the likes of Amazon and Uber. Indeed, if you haven’t already started using containers then you need to hurry up, because they are fast becoming standard in enterprise IT environments. In two to three years I predict they will be well established, and those companies that have not embraced them will be at a distinct disadvantage.
The stateful challenge
It is widely accepted that Cloud Computing is integral to IT strategies as organisations increasingly shift to multi-cloud or hybrid IT models. For example, in 2018 Forrester estimated that 74 per cent of enterprises described their strategy as hybrid or multi-cloud, with 62 per cent of public Cloud adopters using more than two cloud environments or platforms. Logically, this approach strengthens the case for containers, because the Cloud makes it easier to deploy them.
The big container challenge for databases, especially relational ones, is that they are stateful, whereas container platforms were designed for ephemeral, stateless workloads. To use a common expression from the field: relational databases are typically treated like ‘pets’, whereas container platforms were built for ‘cattle’. If you spin up database instances without paying attention to this issue, you will create problems for your company. Historically, containers and their platforms have not been good at providing facilities for stateful workloads, yet as we move to the hybrid model it will be essential that IT teams can interrogate data dispersed across on-premise and Cloud applications. For DBAs, despite this challenge, resisting the advance of containers feels like a career-limiting move. Demand from the business is forcing DBAs to weigh the complications of containerising highly customised, relational legacy databases against the potential of containers to increase agility. They are looking to understand how they can make the move with the least possible risk.
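To make that stateful/stateless tension concrete, here is a minimal sketch using the Docker SDK for Python and a local Docker daemon. The image tag, container name and volume name are illustrative assumptions; the point is simply that the container can be destroyed and replaced while the state it manages lives on outside it.

```python
# A minimal sketch using the Docker SDK for Python (pip install docker).
# It shows the core workaround for statefulness: the container is disposable,
# but the database files live in a named volume that outlives it.
import docker

client = docker.from_env()

# Durable state lives here, not inside the container's writable layer.
client.volumes.create(name="pgdata")

def start_db():
    # The container itself is treated as replaceable ("cattle") ...
    return client.containers.run(
        "postgres:16",
        name="pg",
        detach=True,
        environment={"POSTGRES_PASSWORD": "example"},
        # ... while the data directory is bound to the named volume ("pet" state).
        volumes={"pgdata": {"bind": "/var/lib/postgresql/data", "mode": "rw"}},
        ports={"5432/tcp": 5432},
    )

db = start_db()

# Simulate the platform replacing the container: remove it and start a fresh one.
db.stop()
db.remove()
db = start_db()  # the new container reattaches to the same volume, so data survives
```

Orchestrators such as Kubernetes formalise the same idea with persistent volumes and StatefulSets, but the principle is identical: keep the state out of the disposable unit.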
As someone who has advised DBAs, I feel your pain! That said, if you haven’t done so already, I recommend that you start your journey with containers. If you get to grips with the potential and pitfalls of the technology now, you will have an advantage in depth of understanding and accumulated skills as containers mature.
Considering a container-filled future
Clearly, do not run before you can walk. Deploy containers in test environments first and experiment with them, but it is also important to consider how their use will evolve as the technology matures. I liked recent comments from Sebastian Krause of IBM, who sees the adoption of containers at the network edge as a trend. This makes sense, especially for complex enterprises that have built up highly customised legacy environments over time. Operating at the edge, you do not have to shift all your applications at once.
This also gives the technology time to mature and allows more tools to develop that address some of the key challenges, such as:
- Connecting to storage and ensuring you have the right storage capacity
- Understanding the dependencies between your existing applications as you shift to containers
- Choosing the tools you need to address statefulness
- Ensuring you have the right networking for high throughput and low latency
- Avoiding misconfigurations that could lead to security vulnerabilities (see the sketch after this list)
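On that last point, even a very simple automated check can catch common container misconfigurations before they become vulnerabilities. The sketch below again assumes the Docker SDK for Python and a local Docker daemon, and inspects running containers for a few illustrative red flags; a real audit would cover far more.

```python
# A minimal misconfiguration check using the Docker SDK for Python.
# The specific checks here are illustrative, not exhaustive.
import docker

client = docker.from_env()

for container in client.containers.list():
    cfg = container.attrs["Config"]
    host_cfg = container.attrs["HostConfig"]
    findings = []

    if host_cfg.get("Privileged"):
        findings.append("runs in privileged mode")
    if not cfg.get("User"):
        # An empty User usually means the container falls back to the image
        # default, which is often root.
        findings.append("no explicit non-root user set")
    if host_cfg.get("NetworkMode") == "host":
        findings.append("uses the host network namespace")

    if findings:
        print(f"{container.name}: " + "; ".join(findings))
```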
The leading CSPs are beginning to acknowledge the importance of portability in a multi-cloud world. The good news is that there is still time to understand the limitations of containers for relational databases in terms of features and security and, more importantly, how to implement workarounds. The key in this phase is to identify the gaps and work out how you can address the shortcomings. From my perspective, there are four key areas:
- High availability – databases in mission-critical environments must remain operational 24/7, so it is vital to automate database management to minimise the impact on business operations
- Monitoring – following on from the previous point, having the right approach to monitoring your containers so that potential issues are identified early is key (a minimal sketch follows this list)
- Disaster recovery – as with traditional database environments, redundancy of a single component (the containers) is no guarantee against data loss
- Routing and load balancing – to ensure your containers are managing workloads efficiently, it is critical to develop the right approach to shifting workloads between them
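As a flavour of what automating the first two areas can look like, the sketch below polls a database container and restarts it if it stops or reports itself unhealthy. It again assumes the Docker SDK for Python, a container named "pg" and a health check defined in the image; in production you would rely on your orchestrator's liveness probes and an alerting pipeline rather than a hand-rolled loop.

```python
# A minimal monitoring-and-recovery loop for a database container, using the
# Docker SDK for Python. Container name and polling interval are assumptions.
import time
import docker
from docker.errors import NotFound

client = docker.from_env()

def check_and_recover(name="pg", interval=30):
    while True:
        try:
            container = client.containers.get(name)
            container.reload()  # refresh cached state from the daemon
            health = container.attrs["State"].get("Health", {}).get("Status")

            if container.status != "running" or health == "unhealthy":
                print(f"{name} is {container.status}/{health}, restarting")
                container.restart()
        except NotFound:
            print(f"{name} not found; has it been removed?")
        time.sleep(interval)

check_and_recover()
```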
Resistance is futile
Containers will offer you choice and control, which is important as you decide when to scale or shift workloads in order to maximise performance and drive cost efficiencies. This will have a fundamental impact on the dynamics and economics of the database market: containers will give you the portability you want, so you will no longer be as dependent on the traditional commercial database providers.
How containers influence database modernisation will be one of the major decisions for IT departments in the next few years. As last year’s Diamanti survey revealed, there is still a long way to go for containers in the database market: only 30 per cent of its respondents saw a use case for running databases in containers, compared with 55 per cent who saw containers as suited to new cloud-native applications.
That said, I believe database innovation is integral to successful digital strategies, and containers will be the key enabler for this innovation. The value of containers is clear. A recent survey for Portworx and Aqua Security underlined the key benefits: 37 per cent of respondents cited developer productivity, 20 per cent increased agility and 19 per cent the ability to avoid being locked into a single vendor.
Containers are maturing fast, so do not be put off by concerns that they are not designed for the stateful requirements of relational databases. By our estimates (based on Forrester Research), there are significant benefits, including:
- Reduced hypervisor licensing fees that can lower production environment costs by as much as 50 per cent.
- Process optimisations that not only streamline dev/test costs by as much as 70 per cent, but also shorten time-to-market so that revenues and other business values are captured more quickly.
- Leaner runtime resource consumption compared with VM deployments, since you do not need to run as many copies of the OS, which can mean up to 80 per cent fewer servers.
While change always brings uncertainty and can be perceived as adding complexity, done correctly containers offer significant value as you progress your digital strategies. Resisting containers will not just affect your competitiveness; it will hinder your ability to shape your IT strategy according to the needs of your business. To put it somewhat more bluntly: no matter how determined you are, it is never wise to run against the herd, especially when it is stampeding towards you.
This article originally appeared on ITPro.