“Organizations can spend a huge amount of time and money just to discover that a new database solution will not give them what they need. It’s a huge risk that can take years to play out for mature databases. For most companies, this is just not an option,” says Derek Swanson, CTO, Silk, in an exclusive interview with EnterpriseTalk.
ET Bureau: What are the challenges faced by enterprises while managing their cloud database?
Derek Swanson: The primary challenges of managing cloud databases are a lack of features, functions, and performance; difficult cost control; and limited scalability compared to traditional on-premises solutions. It is a very complex subject for every large organization.
Organizations want the performance, control, agility, and rich functionality of running on-prem, without maintaining costly data center operations. Today, it’s not possible to achieve all of these things for large enterprise databases using solely cloud-native resources.
There are many design, operational, governance, and cost issues that need to be thought through and addressed properly for a successful implementation of cloud-based databases.
To begin with, does the organization already have existing on-premises “traditional” databases with complex and well-established business logic? If not, and it is instead looking to build new systems using cloud-native resources, the path is certainly simpler but brings other questions and challenges around implementation models and cost. If the organization is looking to migrate existing systems from on-prem into the cloud, there are a few primary challenges.
Data performance is one huge problem—throughput on cloud systems is extremely low and latency is quite high for anything over a few hundred MB/s. Scaling performance into the GB/s range at 1ms latencies (what organizations get with on-prem solutions) is extremely difficult, very costly, and very complex to architect, if it is achievable at all.
Building a hybrid solution with some elements running on-prem and others running in the cloud is a partial solution to this problem, however, it usually is not a good fix for databases due to the distances involved.
Database applications can generally run fine in the cloud because they are usually small and lightweight enough not to exceed cloud performance limits. But the latencies between the cloud and on-prem can be unacceptably high. Therefore, the “data gravity” of the database anchors all the other applications on-prem as well.
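As a rough illustration of why that latency matters (the figures below are hypothetical assumptions, not numbers from the interview), an application that issues database calls serially is capped by the round-trip time alone, no matter how fast the database itself is:

```python
# Back-of-envelope sketch: round-trip latency alone bounds a chatty
# application that makes database calls one after another.
# The 1 ms and 30 ms figures are illustrative assumptions.

def max_serial_calls_per_second(round_trip_ms: float) -> float:
    """Upper bound on serialized DB calls/sec at a given round-trip time."""
    return 1000.0 / round_trip_ms

on_prem = max_serial_calls_per_second(1.0)   # ~1 ms: app and DB co-located
hybrid = max_serial_calls_per_second(30.0)   # ~30 ms: cloud app, on-prem DB

print(f"co-located: {on_prem:.0f} calls/sec")  # 1000 calls/sec
print(f"split:      {hybrid:.0f} calls/sec")   # ~33 calls/sec
```

A 30x drop in serialized call rate is why the database's data gravity tends to pull the application tier back on-prem with it.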
Organizations can spend a huge amount of time and money just to discover that a new database solution will not give them what they need. It’s a huge risk that can take years to play out for mature databases. For most companies, this is just not an option.
Migrating smaller support databases with little business logic and small performance requirements is much easier, but those databases are usually not business or mission-critical.
Cloud IaaS database deployments are managed by the organization itself and are subject to the overprovisioning problems mentioned earlier. As soon as database requirements exceed a threshold, larger instances must be provisioned in bulk; this inevitably causes a great deal of waste as resources sit either completely unused or heavily underutilized.
Gartner has estimated that about 35% of all cloud resources being paid for today are completely unused due to overprovisioning.
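The step-function nature of that waste can be sketched with a toy model (the 16-vCPU instance size and demand figures are hypothetical, chosen only to show the shape of the problem): capacity is bought in coarse instance-size increments, so utilization craters every time demand barely crosses a threshold.

```python
# Toy model of bulk provisioning waste: capacity is only available in
# whole-instance increments, so demand just past a threshold strands
# most of the newly purchased capacity. All numbers are hypothetical.
import math

INSTANCE_VCPUS = 16  # assumed instance-size step


def provisioned_vcpus(demand_vcpus: int) -> int:
    """Round demand up to the next whole instance."""
    return math.ceil(demand_vcpus / INSTANCE_VCPUS) * INSTANCE_VCPUS


def waste_pct(demand_vcpus: int) -> float:
    """Share of paid-for capacity that sits idle."""
    prov = provisioned_vcpus(demand_vcpus)
    return 100.0 * (prov - demand_vcpus) / prov


print(waste_pct(17))  # demand just over one instance -> 46.875% idle
```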
If the organization instead uses a DBPaaS or DBaaS, there are extremely stringent limitations on performance, data services, and management capabilities, and the costs to achieve even minimal performance are quite high.
ET Bureau: How can enterprises leverage innovative technologies to enhance the performance of their cloud environment as well as reduce related expenses?
Derek Swanson: There is a huge amount of investment going into cloud efficiency and optimization practices and software. The cloud today is notorious for complex and somewhat arcane billing practices; difficult-to-understand limitations, quotas, and provisioning requirements; and a general lack of understanding of how its hundreds of pieces of software, infrastructure, and processes work together.
Large enterprises especially can leverage 3rd party software in three primary ways to control costs and enhance governance. First, they should engage a billing analysis company that understands and can explain clearly what the enterprise is paying for and what it is actually getting.
Second, they should leverage a cloud services company to analyze what they are actually using. These services can look at the environment and identify which resources are well used, which are under-utilized, and which aren’t being used at all. Third, they should deploy non-cloud-native platform tools to replace or augment cloud-native services that underperform or are too costly.
If enterprises understand what they’re paying for and what’s being delivered and how much of that they are really using, then they can take positive steps to eliminate waste and replace underperforming or excessively costly elements with robust 3rd party solutions.
ET Bureau: What steps can enterprises take to secure the data in their cloud environment?
Derek Swanson: Securing data is about proper governance under the zero-trust cloud security model: making sure enterprises grant only the minimum required level of access to each component in their environment, then audit everything.
The cloud handles security pretty well as long as end-users configure the zero-trust model properly. Human error and misunderstanding are the biggest problems today, so using security analysts and consultants to conduct audits and pen-tests of the environment is extremely important to find holes and leaks before thieves do.
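The "grant the minimum, then audit" idea can be sketched in a few lines. This is a toy policy model, not any real cloud provider's API; the service names and permission strings are invented for illustration. An audit simply flags every grant a component holds but never exercises:

```python
# Least-privilege audit sketch over a toy policy model.
# Service names and permission strings are hypothetical, not a real
# cloud IAM API; real audits would pull grants and access logs instead.

granted = {
    "billing-svc": {"db:read", "db:write", "storage:delete"},
    "report-svc": {"db:read"},
}
observed_use = {
    "billing-svc": {"db:read", "db:write"},
    "report-svc": {"db:read"},
}


def excess_grants(granted, used):
    """Return permissions that were granted but never exercised."""
    return {
        svc: perms - used.get(svc, set())
        for svc, perms in granted.items()
        if perms - used.get(svc, set())
    }


print(excess_grants(granted, observed_use))
# flags billing-svc's unused storage:delete grant for revocation
```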
ET Bureau: How can enterprises effectively scale their data in the cloud while consistently delivering high-performance applications?
Derek Swanson: Scalability in the cloud is a bit of a paradox. If enterprises have applications designed for horizontal scaling or sharding, built to plug into orchestration and service-mesh platforms like Kubernetes and Istio, designed to run as microservices, and able to behave elegantly in containers, then scaling effectively for performance is achievable today.
However, this kind of solution is only applicable to simple applications. By far, most high-performance applications are delivered on traditional, vertically scaled, closed systems that cannot be containerized or orchestrated by modern platforms like Kubernetes. Moving those application stacks to the cloud has not happened yet, as they require a refactor, if they can even be rewritten at all.
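For applications that do fit the horizontal model, the auto-scaling Swanson mentions follows a simple rule; the one below mirrors the formula documented for Kubernetes' HorizontalPodAutoscaler (desired = ceil(current x currentUtilization / targetUtilization)), with hypothetical workload numbers for illustration:

```python
# Sketch of the replica-count rule the Kubernetes HorizontalPodAutoscaler
# documents: desired = ceil(current * currentUtil / targetUtil).
# The replica counts and CPU figures below are hypothetical.
import math


def desired_replicas(current: int, current_util: float, target_util: float) -> int:
    """Replicas needed to bring per-replica utilization to target."""
    return math.ceil(current * current_util / target_util)


# 4 replicas running hot at 90% CPU against a 60% target -> scale out to 6
print(desired_replicas(4, 90.0, 60.0))  # 6

# 4 replicas idling at 30% CPU against a 60% target -> scale in to 2
print(desired_replicas(4, 30.0, 60.0))  # 2
```

Vertically scaled, stateful databases have no equivalent of this rule, which is exactly the paradox: the workloads that need the performance most are the ones cloud auto-scaling cannot help.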
Derek Swanson has 20 years of experience as a technology evangelist, systems architect, and data systems engineer. At Silk, he manages the worldwide sales engineering organization and is the senior customer-facing technologist and product evangelist. Prior to Silk, Derek had a successful career at Sorenson, Code Communications, and Unisys. He holds a Bachelor’s in Political Science and Government from Brigham Young University.