Tech Barometer – From The Forecast by Nutanix


AI Cloud Native and Hybrid Cloud Work Together

AUG 1, 2024 | 6 MIN

Description

In this Tech Barometer podcast, Tobi Knaup, general manager for Cloud Native at Nutanix, explains what’s accelerating cloud native application development and how enterprises run these apps across hybrid multicloud IT environments.

Find more enterprise cloud news, feature stories and profiles at The Forecast.

Transcript (AI generated):

Tobi Knaup: Cloud Native really is a concept that describes how to build and run modern applications. Those applications are typically microservice-oriented, they run in containers, and they’re dynamically managed on top of a container platform, typically Kubernetes, which has really become the industry standard. Kubernetes runs anywhere. You can run it on the public cloud. You can run it on a private cloud. You can run it on the edge. So if you’re building applications on top of Kubernetes in containers, that makes them truly portable, so you can run them anywhere, hybrid multicloud.
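The portability Knaup describes comes from declarative manifests: the same specification can be applied unchanged to any conformant Kubernetes cluster, whether it runs in a public cloud, a private cloud, or at the edge. A minimal illustrative sketch (the application name, image, and replica count are placeholders, not anything from the conversation):

```yaml
# Minimal Deployment manifest: applies unchanged to any conformant
# Kubernetes cluster, regardless of which cloud or datacenter hosts it.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: demo-app                # hypothetical name, for illustration only
spec:
  replicas: 3
  selector:
    matchLabels:
      app: demo-app
  template:
    metadata:
      labels:
        app: demo-app
    spec:
      containers:
        - name: demo-app
          image: registry.example.com/demo-app:1.0   # placeholder image
          ports:
            - containerPort: 8080
```

Moving this workload to a different environment means pointing `kubectl apply -f deployment.yaml` at a different cluster; the manifest itself does not change.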

Jason Lopez: That’s the voice of Tobi Knaup, the general manager for Cloud Native at Nutanix. In this short podcast, the editor of The Forecast, Ken Kaplan, chats with Tobi about the integration of cloud native applications and hybrid multicloud environments. In this discussion they touch on running cloud native applications across various cloud environments; the synergy between AI and Kubernetes; and the shift to Kubernetes for consistency and portability.

[Related: Flattening the Cloud Native Learning Curve]

Ken Kaplan: What was the limitation that’s been unlocked with Kubernetes?

Tobi Knaup: The limitation that was there before is there wasn’t a consistent sort of packaging format for applications that made them really easily portable. And containers kind of provide that abstraction. In computer science, we sometimes call it a layer of indirection. So it abstracts applications away from the underlying infrastructure. And containers are a very lightweight way to package an application, so it’s very easy to ship them all around, all over the place.

Ken Kaplan: And their connection to managing data in different ways, is that something that you have to think about?

Tobi Knaup: Yeah, absolutely. So containers, or Kubernetes when it was first launched, actually did not have support for data. It purely ran stateless applications. Only a few years later, the community, it was actually our engineers at D2iQ together with Google, created what’s called the Container Storage Interface (CSI). And that became the industry standard for attaching storage to containers. But that was still very bare bones, just simple volumes attached to containers. And so what we did here at Nutanix recently with NDK (application-level services for Kubernetes) really takes that to the next level. It adds capabilities for disaster recovery and resilience, so it makes it really, really easy to run these kinds of sensitive stateful applications on Kubernetes.
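The CSI mechanism Knaup mentions surfaces to application teams as PersistentVolumeClaims: a claim asks a CSI driver (via its StorageClass) to provision storage, and a Pod mounts the claim by name. A minimal sketch, assuming a CSI-backed StorageClass named `csi-storage` exists on the cluster (all names and sizes here are illustrative, not from the conversation):

```yaml
# PersistentVolumeClaim: requests a volume from a CSI-backed StorageClass.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: demo-data                   # hypothetical name
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: csi-storage     # assumed CSI-backed StorageClass
  resources:
    requests:
      storage: 10Gi
---
# Pod: mounts the claim, giving the container durable storage.
apiVersion: v1
kind: Pod
metadata:
  name: demo-db
spec:
  containers:
    - name: db
      image: registry.example.com/demo-db:1.0   # placeholder image
      volumeMounts:
        - name: data
          mountPath: /var/lib/data
  volumes:
    - name: data
      persistentVolumeClaim:
        claimName: demo-data
```

This covers only the "simple volumes attached to containers" layer Knaup describes; snapshotting, replication, and disaster-recovery workflows sit above it.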

Ken Kaplan: Awesome, and we’ve been hearing the word resilience and it sounds like the hybrid multicloud, building your apps in containers, these kinds of things are bringing some more control and flexibility for things that you need to keep your business running. Talk about hybrid multicloud in the sense that it’s a choice or it’s where people are today. How did we get here?

Tobi Knaup: Yeah, so I think there are many reasons why people choose hybrid or multicloud. It’s typically not what a lot of people think at first. I think when hybrid cloud or multicloud first became a concept, people thought, people are going to look for the cheapest compute all over the world and that’s where things are running. I talked to some customers that are doing that actually, but they’re kind of rare and they’re typically hedge funds. So of course they look at market prices. But for most people, the constraints are different. It could be regulatory constraints, right? Data needs to reside in a certain geography, or frankly just what an organization is comfortable with, where they’re comfortable putting their most sensitive data. Some don’t want to put it on the public cloud, right? But they appreciate the flexibility and the dynamism of a public cloud for new development. So they’re sort of running their steady-state production workloads on prem, but they’re giving cloud environments to the development teams for building the next generation of apps, which then may move on prem later when they go to production. So I think a lot of organizations are looking at what’s the best environment for each app, and that’s how they’re choosing. And in large organizations, what we see too is that’s where the multicloud strategy is very common. Sometimes, also for regulatory reasons, they have to go with a dual-vendor strategy. But also, large companies tend to acquire a lot of other companies, and so the cloud that the company they bought uses may be different from the cloud that corporate uses, and so now they have to manage multiple.

[Related: Containers Progress in a World of Data Center Virtualization]

Ken Kaplan: That’s a lot of complexity. I guess we could finish here with some of the things you’re seeing that you’re excited about, that are grabbing your attention and coming up. Everyone’s talking about AI. AI and containers and Kubernetes, they are going together. What’s on your mind?

Tobi Knaup: Yeah, 100% they’re going together. So AI is my other passion besides cloud native. I’ve been doing a lot of work there over the years, actually. And it’s worth mentioning that Kubernetes came out of Google. It’s sort of built on the same ideas on which Google runs, and they’ve been running their own AI workloads on that platform internally for years. And today, the leading AI companies in the world, like OpenAI, are running on Kubernetes. So if you’re using ChatGPT, that’s running on Kubernetes. And so it’s a really great fit for these AI workloads, because AI workloads tend to be very dynamic. They need to share resources, very expensive resources in this case. GPUs are very expensive and in short supply. So it’s a great fit. Also, organizations want to iterate on AI very quickly, and Kubernetes really enables that, enables people to ship software fast. So with GPT-in-a-Box, which runs on Kubernetes, we’re really making it easy for organizations to put these models into production. And we’re also putting AI into our Kubernetes product, into NKP, to assist Kubernetes platform engineers. So there’s a copilot-type chatbot that’s built in that answers their questions, both sort of generic knowledge questions as well as questions about the environment. They can use it to troubleshoot. So, you know, it really simplifies their lives.
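The GPU sharing Knaup points to works through Kubernetes resource requests: a device plugin advertises each node's GPUs as a schedulable resource, and Pods request them the way they request CPU or memory. A minimal sketch, assuming the NVIDIA device plugin is installed on the cluster (the Pod and image names are placeholders):

```yaml
# Pod requesting one GPU. The scheduler places it only on a node with a
# free GPU, so scarce, expensive accelerators are shared across the cluster.
apiVersion: v1
kind: Pod
metadata:
  name: inference-demo              # hypothetical name
spec:
  containers:
    - name: model-server
      image: registry.example.com/model-server:1.0   # placeholder image
      resources:
        limits:
          nvidia.com/gpu: 1   # requires the NVIDIA device plugin on the node
```

Because the request is declarative, workloads queue for GPUs rather than pinning them, which is what makes rapid, shared iteration on AI workloads practical.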

[Related: Cloud Native Architecture Critical to 5G Success]

Ken Kaplan: Everything’s coming together.

Tobi Knaup: It is indeed.

Jason Lopez: Tobi Knaup is the general manager for Cloud Native at Nutanix. Ken Kaplan is the editor in chief of The Forecast. They spoke in Barcelona, Spain at .NEXT 2024. This is the Tech Barometer podcast, a production of The Forecast. For more podcasts, video and articles about technology and the people in tech, check it out at www.theforecastbynutanix.com.