Tech Barometer – From The Forecast by Nutanix
In this Tech Barometer podcast, Rene van den Bedem of Microsoft’s Cloud and AI division discusses the future of AI and how cloud computing is evolving to power more aspects of life.
Find more enterprise cloud news, feature stories and profiles at The Forecast.
Transcript (AI generated):
Rene van den Bedem: When I started, the personal computer was becoming more miniaturized.
Jason Lopez: Rene van den Bedem says when he started his career in computing, it was 1994. The trend was smaller, more compact machines. Windows 3, with its more user-friendly graphical interface, was the dominant OS. It was a period of diversification in personal computing. This is the Tech Barometer podcast. Rene van den Bedem is Principal Technical Program Manager at Microsoft, where he does a lot of work in cloud and digital transformation. We asked him about the state of enterprise computing, and he ushered us into a sort of timeline that ends up at AI… but starts in the 90s with computers becoming more miniaturized.
Rene van den Bedem: And then the networking constructs had just come out.
[Related: Focus Shifts to Migration in Wake of Broadcom’s VMware Acquisition]
Jason Lopez: The emergence of Ethernet, token ring technology, and TCP/IP helped establish the building blocks for the interconnected world we live in today. It was the beginning of the transition from military and academic use to the public.
Rene van den Bedem: Jump 10 years later, we went from narrowband in telco, so you know, like PSTN, dial-up modems, 64k data circuits.
Jason Lopez: This shift from slow to faster connectivity laid the groundwork for the high-speed internet technologies that would make the cloud possible.
Rene van den Bedem: Jump to let’s say 2001-2002, you had the explosion of the internet. The internet really became this mainstream thing.
[Related: IT Leaders Get AI-Ready and Go]
Jason Lopez: This was the dot-com era, with faster chips and advances in hardware. It moved us from dial-up to broadband. It was a time marked by the spread of Wi-Fi. Mobility was becoming a big deal.
Rene van den Bedem: So you had all of these building blocks coming together to where we are now, with the invention of the cloud back in 2006, I think it was, with AWS.
Jason Lopez: There was a fundamental transformation in information technology, where physical infrastructure was being replaced by the cloud. It gave users unprecedented levels of accessibility, efficiency, and scalability.
Rene van den Bedem: And now in 2024 with AI, we’re on the cusp of this next rocket launch that’s coming.
Jason Lopez: AI is becoming a tool with a wide range of uses, much the way calculators did back in the 80s and 90s. Microsoft, Rene says, is integrating AI into all its products under the name “Copilot.” That change signals the era we’re entering, one where AI is a must-have technology.
[Related: Creating AI to Give People Superpowers]
Rene van den Bedem: People who work in an industry, if they don’t adopt these new tools, they’re going to be left behind. So in 10 years’ time, most jobs around the world will have some type of AI-based copilot that you’ll need to use to do your job, and those that don’t, they’ll just be left behind.
Jason Lopez: It’s a continual evolution. And it especially applies to tech companies which must adapt to the changing needs and challenges of storing and processing an ever-increasing volume of data.
Rene van den Bedem: Obviously, having very, very fast, expensive storage, you need that for a part of the workloads, but then the ability to archive petabytes of data so that you can derive business value from your data sets, that’s a necessity. So storage is always evolving. I’m sure something similar is going to be true for quantum computing. We’re going to see a shift in the way that we build our traditional computing models so that they can harness and integrate with AI as well as quantum computing.
Jason Lopez: Cloud service providers are beginning to offer quantum products in a limited way, though scalable quantum computers are not yet a reality. Right now, it’s in the realm of researchers and developers to experiment with quantum principles and algorithms.
Rene van den Bedem: Most of the cloud providers have a service that allows customers to play with quantum computing.
Jason Lopez: Unlike traditional computing, quantum computing stores information in qubits rather than bits, which offers an exponential increase in processing power for certain types of problems. Rene says the future looks like a hybrid.
Rene van den Bedem: Quantum computing is not going to replace traditional computing. Every technology has got pros and cons. Quantum computing, even though the processing is happening, is not really able to maintain its state once the problem is solved. So what happens is you’ll have your quantum computing model that’s running, and then you’ll have traditional computing services wrapped around that, and all of the data, once it’s solved, goes into traditional computing software constructs, I suppose you would say, to maintain the results of that data and the history and the archives and all the reporting and everything. It’s a hybrid technology where the two need to work together.
Jason Lopez: With the rise of the cloud, many businesses rushed in. There were, and are, all sorts of very good reasons: reduced IT costs, scalability, flexibility and agility, access to big data analytics, access to AI. And then, simply, it was a trend. There was a sort of peer pressure to move to the cloud.
Rene van den Bedem: When they got there, they realized, “Oh, the business goals that we’ve been trying to achieve are actually not being met,” or they weren’t considered. Typically, cost and operational complexity, because there’s a level of skill that you need to have to work in a hyperscaler correctly, regardless of whether it’s Google, Azure, or AWS. It may turn out that the laws of the land, the laws of physics, or the laws of economics are very, very important to them. So they’re constrained as part of the business goals that they’re trying to achieve. And it turns out, “Okay, running in the cloud is not such a good idea.” And then they’re forced to go back. So I have seen that a few times. Because obviously, you’ve got people, process, technology, and financials; those are the four major domains. And if one of those is weak, then you’re probably not going to be successful.
Jason Lopez: This is what hyperconverged infrastructure was born to do: consolidate storage, compute, and networking resources into a single, easily managed, software-defined platform that helps reduce capital and operational expenditures. Nutanix launched its first HCI product in 2011, focusing on making data center infrastructure invisible. Rene says that when it comes to virtualization, he’s seen a variety of platforms such as VMware and Hyper-V, though the Nutanix platform offers a more user-friendly experience. Nutanix Cloud Clusters, also known as NC2, is aimed at more easily managing workloads on hybrid clouds. Rene’s job is to make sure these platforms work inside Azure.
Rene van den Bedem: The beauty of running NC2 on Azure or Azure VMware Solution, to use Microsoft as an example, because that’s all being abstracted in the back end and the customer doesn’t see it, it’s a lot easier for them to consume, because really the main requirement is “do you have an Azure landing zone,” and then you can build whatever service you want on it.
So, I’ve been working with Nutanix since 2014, and what I always respected about Nutanix was the fact that Nutanix was the company that invented hyperconverged infrastructure, and it was really about the customer experience. And even with the CSAT scores and support, there is really no better support than Nutanix. So for customers that buy into the Nutanix ecosystem, it’s similar to Apple fanboys and VMware fanboys: Nutanix customers are very passionate about the infrastructure and the solutions that Nutanix brings to market. And when you look at the evolution of NC2 on Azure, NC2 on AWS, that’s really just an extension of the things that the customer is asking for. They want to get out of the data center business, they want to move into the cloud, and that’s what Nutanix is doing with this multi-cloud strategy. And obviously VMware is going down a similar path as well. What makes Nutanix more interesting is that focus on customer experience, and then also you have this Broadcom acquisition of VMware. At the moment, the market is very much in a state of flux, and it’s not clear where the chips are going to fall. What does that mean for Nutanix? I think it puts Nutanix in a very, very interesting position. Because customers that don’t have visibility into the platform that their mission-critical and business-critical apps are running on, that’s a problem. If you introduce risk to the story, you’re going to have a lot of customers that are going to be looking to shift and change to mitigate that risk.
Jason Lopez: Rene van den Bedem is Principal Technical Program Manager at Microsoft. This is the Tech Barometer podcast. Thanks for listening, I’m Jason Lopez. Tech Barometer is produced by The Forecast, where you can find more stories on technology. Check out Jason Johnson’s article profiling Rene, entitled “Simplifying Hybrid Cloud and Migrations to Azure Public Cloud.” Just go to theforecastbynutanix.com. That’s theforecastbynutanix, all one word, dot com.
Editor’s note: Learn about Nutanix’s hybrid multicloud capabilities, compare offerings from VMware by Broadcom and Nutanix, see how to migrate to Nutanix, then explore the VMware to Nutanix Migration Promotion.