Dell’s Arthur Lewis: The AI Revolution Requires A ‘Rearchitecture Of The Data Center’
‘Data centers have been built the way they have been built for the last 30 years. That isn’t going to get us to the next 30 years. That is going to have to change,’ Arthur Lewis, president of Dell’s Infrastructure Solutions Group, tells CRN.
Arthur Lewis, president of Dell Technologies’ Infrastructure Solutions Group, said the AI revolution is here, and that the proving ground for whether it can scale globally is the data center, which as it stands today is not ready.
“So if you think about the data today, how much data sits in a hot tier, versus a warm tier, versus a cold tier? All that is going to change, where data is mostly in hot and warm tiers, constantly in circulation, feeding these engines,” Lewis said. “If you believe that, then there has to be a dismantling of silos of the past where everything is connected. There will be a rearchitecture of the data center. It doesn’t necessarily mean, ‘Hey, rip everything out and put all this new stuff in.’”
Much of the large-scale compute and storage sold around the world flows through Lewis’ division, with Dell Technologies leading the market in both categories.
During its last quarter, Dell’s AI-optimized server orders increased to $2.6 billion, while shipments of those servers grew 100 percent quarter over quarter to $1.7 billion.
Dell’s AI-optimized server backlog at the end of the first quarter was $3.8 billion, which the company said is a fraction of its pipeline for those systems. Wall Street analysts raised concerns that the cost of winning those deals would eat into margins. On the May 30 earnings call, Dell Vice Chairman and COO Jeff Clarke acknowledged the costs were high but said the return would be worth it.
Lewis told CRN that Dell Technologies is thinking long term.
“There’s a lot of historical precedent to how we’re thinking about this,” he told CRN. “I wouldn’t look at it as right or wrong. Are we directionally correct? It’s like putting a puzzle together without the box. But we’re having enough conversations with partners, and conversations with customers, and we know enough about history to say, ‘Hey, based on all the data we have today, this is the best view we have today of what the future looks like.’”
Part of that future involves overcoming significant choke points in modernizing computing infrastructure, electricity chief among them.
Dell’s chief AI officer, Jeff Boudreau, told CRN that in two to three years there may not be enough electricity in data centers to support GPU growth, since there isn’t even enough for the AI systems currently on order.
“If you look at some of the backlogs coming out of what we have and Nvidia has, there’s not enough power and cooling to support that,” Boudreau said. “So one of the big concerns, and where I think they’re going to is, ‘Yes, there is a need to rethink the way we do data centers. Historically, what got us there isn’t going to get you there.’”
Here is more from CRN’s interview with Lewis at Dell Technologies World.
I talk to people in the data center space and they don’t seem ready to replace their entire hardware stack. Then you have the power requirements and the space requirements. What is the message regarding the data center for the AI era?
What we’re trying to do is paint a picture that there is a revolution around AI that is very similar to the Industrial Revolution. The Industrial Revolution was 50, 60 years. This might be half of that.
If you understand what is going on with artificial intelligence, specifically Generative AI, today, it is sort of a stand-alone workload, and you still have a lot of workloads that go along with it. But it doesn’t stretch the imagination to say, with the ubiquity of AI, it’s going to become much more prevalent in the data center.
OK, so what are the implications of that? The implications of that are that infrastructure needs a lot of data. So if you think about the data today, how much data sits in a hot tier, versus a warm tier, versus a cold tier? All that is going to change, where data is mostly in hot and warm tiers, constantly in circulation, feeding these engines. If you believe that, then there has to be a dismantling of silos of the past where everything is connected.
There will be a rearchitecture of the data center. Doesn’t necessarily mean, ‘Hey, rip everything out and put all this new stuff in.’
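For readers unfamiliar with the hot/warm/cold split Lewis keeps returning to, storage tiering is typically driven by how recently data was accessed. The sketch below is a minimal, illustrative policy in Python; the thresholds and tier names are assumptions for the example, not Dell’s implementation.

```python
from datetime import datetime, timedelta, timezone

# Illustrative thresholds only; real storage platforms tune these per workload.
HOT_WINDOW = timedelta(days=7)
WARM_WINDOW = timedelta(days=90)

def classify_tier(last_accessed: datetime, now: datetime | None = None) -> str:
    """Assign a data object to a hot, warm, or cold tier by access recency."""
    now = now or datetime.now(timezone.utc)
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "hot"    # fast media, constantly in circulation feeding AI pipelines
    if age <= WARM_WINDOW:
        return "warm"   # still queried regularly, on slightly slower media
    return "cold"       # archival storage, rarely read

# Example: an object last touched 30 days ago lands in the warm tier.
print(classify_tier(datetime.now(timezone.utc) - timedelta(days=30)))  # -> warm
```

Lewis’ argument is that as AI keeps data "constantly in circulation," policies like this push far more of it into the hot and warm tiers than the cold one.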
How much time is it going to take to rearchitect the data center?
Like Michael [Dell] said on stage, ‘How big is the opportunity? It’s hard to tell.’
This is clearly not going to happen over the next five years. This is probably a multi-decade journey, but it’s hard to tell. Once people realize the benefits of AI and really understand the use cases, a lot of things could accelerate very quickly if the economics make sense. This all kind of comes down to economics. They deploy it. They see the economics of it. They’re going to invest more into it and that transition will happen a lot faster.
It’s not going to happen in the next three to five years.
Does that make sense?
Yes, it does. But the other part of this is the compute, right? Like we as humans are not using less compute, and all of the applications that are coming online are using more compute; it’s just not slowing down.
Right. I mean, AI was sort of the catalyst for it, but we started to realize, actually Jeff Clarke did, that in order to make good use of AI, you really have to modernize your business processes. He talked a lot about streamlining, standardizing and automating, which requires a massive transformation of business operations.
We just celebrated our 40th anniversary. What Michael and Jeff realized is that over the last 40 years, we have built a lot of processes that are not going to get us to the next 40 years. We have to streamline all of those processes, reinvent ourselves in order to really make use of this new technology.
It’s the same thing in data centers. Data centers have been built the way they have been built for the last 30 years. That isn’t going to get us to the next 30 years. That is going to have to change. It’s not going to be a light-switch transition, the way we’re reinventing ourselves is not a light-switch transition, but we’re driven by the economics of it, and saying, ‘Wow, if we transform the business, there’s a massive economic benefit, so we’re making the investment to make that happen.’
Companies are going to see the massive economic benefit and accelerate to make that happen.
Michael had this great analogy on stage where he said when electricity came along, rather than try to design new processes to incorporate the technology, all the water-driven mills instead switched their wheels to electric motors. At the same time, Dell is saying we need to rip out the trillion-dollar data center market and replace it with more powerful GPU architecture. Isn’t that the same thing as replacing a water wheel with an electric motor to power the water wheel?
Exactly. Yes. Again, there’s a lot of historical precedent to how we’re thinking about this. I wouldn’t look at it as right or wrong. Are we directionally correct? It’s like putting a puzzle together without the box.
But we’re having enough conversations with partners, and conversations with customers, and we know enough about history to say, ‘Hey, based on all the data we have today, this is the best view we have today of what the future looks like.’
Every time I meet with a customer, I feel more and more confident in the story that we’re telling in terms of how customers are thinking about artificial intelligence and how they want to deploy it. Then we start asking them questions.
They start saying, ‘Whoa. There is going to be an impact here. I’m going to need some help with data center design because I may need to be rearchitecting how things are set up.’
There does seem to be some maturing happening in the messaging and go to market around GenAI comparing [Dell Technologies World] shows, last year versus this year. It does seem like you are starting to deliver actionable insights to partners. Are you getting that feedback from partners where they are saying, ‘I can actually use this? I can actually sell this?’
Hopefully you heard professional and consulting services at least 89,000 times over the last two shows.
From day one we said, ‘This does not work without strong professional and consulting-level services, and we cannot deliver that without the help of the partner community.’
Doug [Schmitt, president of Dell Technologies Services] has more than 100 partners involved in how we think about professional and consulting services across the entirety of the stack, from day zero to day two plus, that these customers are going to need.
Setting aside power and cooling for a second, the No. 1 objection we hear from customers is, ‘I don’t have the intellectual skill set to deploy Generative AI. This is something that is very new to us. These skills are not everywhere. The technology is moving very quickly. I don’t want to stand up a new ITOps team. I need help here.’
That’s why we are very focused on delivering solutions. Partners can manage this solution. There are all sorts of white-glove deployment services, management services. I think there’s a plethora of opportunities for channel partners.
This is where people are going to have to get creative and innovative and start thinking about, ‘These are the problems that customers are going to have.’
We’re going to have to start crafting solutions to meet their needs.
We are offering partners an incredible platform for them to be creative and figure out what are the solutions that they want to deliver to their customers. Just like every other part of our business, this is going to be a very symbiotic relationship with the channel community in servicing the needs of the end customer.
Even before GenAI, data centers were already booked, so the market was coming your way. How excited are you?
Have you seen us? Is there not a pep in my step? (Laughter)
It’s compute. It’s network. It’s storage. It’s the solution. The software. It’s the professional services.
At the end of the day, the opportunity is massive. What we are after is we want to be able to talk to an enterprise customer and say, ‘What do you want? Leave it with us.’ The customer doesn’t have to worry about silicon diversity, network diversity or storage architecture.
You don’t need to build a whole other AIOps team for AI. ‘Tell us the outcome you want. Leave it with us. From the desktop to the data center to the cloud, what is the outcome that you’re looking for?’
We’ve talked about inferencing orchestration. It’s incredibly important if you are running models on a PC. You want to orchestrate the knowledge that is being created by the models. … We can do that for you. We can do that through Hugging Face.
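As a concrete and deliberately small illustration of "running models on a PC" through Hugging Face, the sketch below uses the open-source transformers library with a placeholder checkpoint; it is not Dell’s orchestration tooling, just what local inference looks like at its simplest.

```python
# Minimal local inference with the Hugging Face transformers pipeline.
# "distilgpt2" is a placeholder model chosen because it is small enough to
# run on a typical PC; swap in any compatible text-generation checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "Why is data center architecture changing?",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

The orchestration layer Lewis describes sits above calls like this one, deciding which model runs where and what happens to the knowledge it produces.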
This is the concept of the AI Factory. The entire estate is within the AI Factory and we want to serve the outcomes that you are looking for.
So the questions that we’re getting now are, ‘What outcomes should I want?’ Oh, let’s talk about that. Then we start to talk about outcomes. What is the ‘art of the possible’ here? Then every answer leads to more questions and more answers and more questions. It’s a super-interesting time. Lightbulbs are going off. ‘We can do this?’ and yes, ‘Actually you can do this.’
Is there going to be a data threshold? The way this produces data, is it even possible to keep up with that demand?
The answer is, in the near term, mathematically, yes. Over time there’s an infinite growth opportunity for data, but I think you are going to see that there are going to be different ways that people think about data, because if you believe in the ubiquity of AI, data is going to feed the model. The model is going to checkpoint. It’s going to continue to fine-tune. It’s going to continue to get better over time.

Once the model is trained and once it’s ‘hardened,’ for lack of a better term, the data that it was trained on isn’t needed anymore. Regulated industries, maybe, but not for a lot of other companies. This is going to take several years for models to harden and for people to be comfortable with this, but once the model is trained and working well, and you have the ability to replicate that model [it becomes] … ‘I have a petabyte of data. What do I need it for? Now I put that data into a model and I used it and I got insight out of it, do I need it any longer?’ There might be pieces of it that I need, but your whole data strategy is going to change.
This is a super interesting space, because I don’t know if ‘limitless’ is the right word, but it really feels like the opportunities are limitless.