How Kubernetes and AI will combine to deliver next-gen services [Q&A]

The popularity of Kubernetes has led to its rapid adoption, but as with any advanced technology, the benefits come alongside challenges.

Being able to take full advantage of the technology means understanding what it can offer and how it fits with other developments like artificial intelligence. We spoke to Tobi Knaup, CEO of independent Kubernetes platform D2iQ, to find out about the current state of Kubernetes and what it promises for the future.

BN: How would you explain Kubernetes to a 10-year-old? And how can the discourse and education around Kubernetes change to better raise awareness of its benefits?

TK: Kubernetes is like an orchestra conductor deciding how many violins should play, which part they'll play, and at what pace and volume. Without the conductor it would be chaos. None of the musicians would know what was going on. Kubernetes is essentially the conductor for applications.

Here's another way to look at it: Kubernetes is Greek for 'helmsman', the person in charge of steering the ship. They direct and control all elements to make the ship sail in the right direction as fast as possible. Without them, that boat might just be sailing in circles.

In more technical terms, Kubernetes automates the deployment and operation of cloud applications, making sure each one gets the right compute, network, storage, and configuration. It manages an organization's entire portfolio of applications on any infrastructure, including cloud, data center, and edge. Organizations are moving to this approach because it gives them agility, faster time-to-market, and more efficient operations. They can spin up, terminate, update, and scale applications with better resilience, governance, security, and visibility, while reducing operational costs.
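To make that concrete, here is a minimal sketch of what 'spin up' and 'scale' look like against the Kubernetes API, using the official Python client; the application name, image, and namespace are placeholders, and it assumes a working kubeconfig:

```python
# Minimal sketch, assuming the official 'kubernetes' Python client
# (pip install kubernetes) and a kubeconfig that points at a cluster.
# The app name and image are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()            # read credentials from the local kubeconfig
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,                  # desired number of running copies
        selector=client.V1LabelSelector(match_labels={"app": "demo-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo-app"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(name="web", image="nginx:1.25"),
            ]),
        ),
    ),
)

# Spin up: declare the desired state and let Kubernetes converge on it.
apps.create_namespaced_deployment(namespace="default", body=deployment)

# Scale up: change the declared replica count; Kubernetes does the rest.
apps.patch_namespaced_deployment_scale(
    name="demo-app", namespace="default", body={"spec": {"replicas": 5}}
)
```

The point is the declarative model: you state what you want running, and Kubernetes continuously works to keep reality matching that description.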

Surveys show widespread awareness and adoption, even outside the cloud native community. CNCF's latest survey found Kubernetes has crossed the chasm to become mainstream: 96 percent of responding organizations are either using or evaluating Kubernetes -- a record high since the surveys began in 2016. Even the US military has issued strategy documents describing its intent to move to open source and Kubernetes.

With that said, there are still organizations yet to deploy Kubernetes, often because they struggle to cope with its technical complexity. Part of this is because many organizations are still new to Kubernetes; they're at the beginning of their journey to building next-generation products. Consequently, many underestimate the complexity of Kubernetes and overestimate their ability to deploy and manage it. There's clearly a shortage of talent here. We often see warning signs that a company's Kubernetes deployment is at risk, such as the platform being late or over budget but still not in production after several months, or teams writing custom scripts to manage Kubernetes.

These are usually bad signs, and we need more training to help developers get from zero to one and build a center of excellence that sets them up for lasting success.

BN: How would you rate the current maturity level of Kubernetes? How is it serving customers now and what does the future hold in store for the technology?

TK: Across the industry, many organizations have moved to the cloud, but few can consider themselves cloud native. It is much more common for specific pockets, teams, or projects to be further along in their adoption journey, leaving the rest of the organization behind. Teams that have built a platform tend to move tenants onto it using a very labor-intensive approach.

Taking advantage of a cloud-native platform also requires some level of re-architecture of existing apps to realize the clear value proposition of cloud native. To meet the cyclical demands of the business, you need things such as automated fault tolerance, self-healing, zero-downtime application deployments, faster development and release of new features, and elastic scaling.
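As a rough sketch of where a couple of those properties come from -- the service name, image, and probe paths below are invented for illustration -- a Deployment can pair health probes with a rolling update strategy so Kubernetes restarts unhealthy containers on its own and never takes capacity offline during a release:

```python
# Hedged sketch, assuming the official 'kubernetes' Python client and a
# configured cluster. The name, image, and probe endpoints are placeholders.
from kubernetes import client, config

config.load_kube_config()

deployment_spec = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "orders-service"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "orders"}},
        "strategy": {
            "type": "RollingUpdate",
            # Never remove a replica before its replacement is ready:
            # this is what makes releases zero-downtime.
            "rollingUpdate": {"maxUnavailable": 0, "maxSurge": 1},
        },
        "template": {
            "metadata": {"labels": {"app": "orders"}},
            "spec": {
                "containers": [{
                    "name": "orders",
                    "image": "registry.example.com/orders:2.4.1",
                    # Self-healing: if this check fails, the kubelet
                    # restarts the container automatically.
                    "livenessProbe": {
                        "httpGet": {"path": "/healthz", "port": 8080},
                        "periodSeconds": 10,
                    },
                    # Traffic only reaches a replica once this check passes.
                    "readinessProbe": {
                        "httpGet": {"path": "/ready", "port": 8080},
                        "periodSeconds": 5,
                    },
                }],
            },
        },
    },
}

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment_spec)
```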

Putting an application in a container is simply not enough. Once your application has been deployed, if something goes wrong, you have to understand what's happening under the hood. We're at a stage of 'mass interest' in using Kubernetes to build platforms, but a lot of people still haven't figured out how to deploy or manage it.

A lot of organizations are still going down the DIY route of building their own platforms, and we often see them spend six or nine months on the build without a single application up and running. Beyond delaying product launches, this skills gap opens the door to cybersecurity vulnerabilities through misconfigured clusters.

That said, there's an evolving ecosystem of platforms and tools around Kubernetes that, as we've seen with plenty of other technologies, will be a big enabler for more companies to get value from it. Taming the complexity of Kubernetes yourself is a beast of a challenge, and to mitigate that there are cloud services and complete Kubernetes platform solutions that make it much easier to use.

The most important thing about Kubernetes, though, which I firmly believe, is that the future will be a combination of cloud native and AI, which we call 'smart cloud native'.

BN: How will AI (the 'smart' component of cloud native) combine with Kubernetes to improve productivity and workflows?

TK: Much like when mobile phones became smartphones, we will see a seismic shift in functionality generate whole new categories of applications and products -- across all industries.

Smart cloud native apps are apps with AI built in, and we think built-in intelligence is going to be a key attribute of the winning products of tomorrow. Many organizations are still early in their cloud native journey, but those at the forefront of innovation, who have been building data-driven products for a while, are starting to leverage AI to gain better insights into their data and build superior AI-driven products. AI has been built into our smartphones for years and powers magical functionality such as autocomplete, digital assistants, and the camera. It's also at the core of self-driving cars, Industry 4.0, and breakthrough advances in healthcare such as the early detection of cancer and other diseases.

AI workloads are operationally complex, and I believe Kubernetes is a particularly good fit for automating away that complexity, delivering the same agility, operational efficiency, and resiliency that it brings to other types of applications. In our latest Kubernetes in the Enterprise survey, over 85 percent of respondents said Kubernetes is their platform of choice for AI workloads.
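As a generic illustration of that fit -- this is a sketch, not any specific customer setup -- a training run can be expressed as a Kubernetes Job that declares the GPU it needs, leaving placement, retries, and cleanup to the scheduler. It assumes the NVIDIA device plugin is installed so nodes expose the nvidia.com/gpu resource, and the image and command are placeholders:

```python
# Hedged sketch: a batch training Job requesting one GPU. Assumes the official
# 'kubernetes' Python client, a configured cluster, and the NVIDIA device
# plugin exposing the nvidia.com/gpu extended resource.
from kubernetes import client, config

config.load_kube_config()

job = {
    "apiVersion": "batch/v1",
    "kind": "Job",
    "metadata": {"name": "train-model"},
    "spec": {
        "backoffLimit": 2,               # retry a failed training pod twice
        "template": {
            "spec": {
                "restartPolicy": "Never",
                "containers": [{
                    "name": "trainer",
                    "image": "registry.example.com/trainer:latest",  # placeholder
                    "command": ["python", "train.py", "--epochs", "10"],
                    "resources": {
                        # Schedule onto a node with a free GPU; Kubernetes
                        # handles placement, restarts, and cleanup.
                        "limits": {"nvidia.com/gpu": 1},
                    },
                }],
            },
        },
    },
}

client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```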

BN: How would you summarize the potential of Kubernetes when it comes to improving society as a whole, in areas like healthcare, smart cars, and so on?

TK: Software has exploded over the last two decades. We already know from Gartner's research that cloud native platforms like Kubernetes will power 95 percent of new digital initiatives by 2025, impacting not just enterprises but people's livelihoods.

Healthcare is a good example. Even MRI scanners run on software, but maintaining and upgrading it has historically been painful. A medical tech company we worked with used Kubernetes so they could update the software on these machines over the air, without having to send a technician to each hospital -- kind of like how Tesla updates its cars. That's a much better experience than taking devices offline for a couple of hours of maintenance. They can also collect live telemetry to improve their product.
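To illustrate the mechanics -- a generic sketch, not the company's actual tooling -- an over-the-air update in a Kubernetes world can be as simple as patching the declared container image, which triggers a rolling update on each device's cluster:

```python
# Hedged sketch: rolling out a new software version by patching the image.
# Assumes the official 'kubernetes' Python client; the Deployment name,
# namespace, and image are hypothetical.
from kubernetes import client, config

config.load_kube_config()

client.AppsV1Api().patch_namespaced_deployment(
    name="scanner-software",
    namespace="default",
    body={"spec": {"template": {"spec": {"containers": [
        # Merged by container name; only the image field changes.
        {"name": "scanner-software", "image": "registry.example.com/scanner:5.2"},
    ]}}}},
)
```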

There's some fantastic research fueling the desire to run AI applications in these scanners. For example, the collaborative research project Fast MRI is investigating the use of AI to make MRI scans up to 10 times faster and cheaper to run while using less data. The goal is a better patient experience overall.

There are many other examples of how smart cloud native apps yield better products that companies can sell. Operating in environments like multi-cloud, hybrid cloud, on-premises, and edge will be super important to the 'smart' component, because the AI needs to run wherever the data is. New data often sits at the edge -- MRI scanners are a great example. You want your Kubernetes platform to run across an organization's entire fleet of heterogeneous environments, providing a consistent way to securely operate and govern your whole cloud infrastructure.

