AI PCs At The Edge Deliver Lower Costs, Reduced Energy Use
- by nlqip
Today’s AI PCs are moving some of the power-hungry, GPU-driven AI training and inference tasks from the data center to the edge, potentially delivering big gains in performance and sustainability.
HP Inc. and Dell Technologies are among the PC makers driving this edge computing AI revolution.
Local inferencing has a latency advantage, and for tasks such as image and video creation it can be five times faster than running in the cloud, said Guayente Sanmartin, senior vice president and division president of commercial systems and display solutions at Palo Alto, Calif.-based HP.
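As a rough illustration of what local inferencing looks like in practice, here is a minimal sketch using ONNX Runtime, assuming a quantized model file and the QNN execution provider that targets Snapdragon NPUs; the file name and input shape are hypothetical, not anything HP has published:

```python
# Minimal sketch: running a model locally with ONNX Runtime.
# Assumptions: a quantized "model.onnx" exists on disk, and the
# Snapdragon NPU is exposed via the QNN execution provider.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],  # CPU fallback
)

# The input shape here is hypothetical; match it to the real model.
inputs = {session.get_inputs()[0].name:
          np.random.rand(1, 3, 224, 224).astype(np.float32)}
outputs = session.run(None, inputs)  # inference never leaves the device
```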
AI at the edge also results in lower costs and energy use, as well as increased security and privacy, since users’ prompts and data never leave their PCs. Inferencing with models locally on a PC can also create a more personal user experience with improved contextual awareness, Sanmartin said.
The technology, financial and consulting industries will likely be the first to experiment with AI PCs, with the earliest adopters being those who can quickly take advantage of AI models, Sanmartin said.
When it comes to sustainability, AI PCs are built around neural processing units (NPUs) that have been optimized for specific tasks, she said. The Snapdragon NPU in the HP EliteBook Ultra G1q draws 1 watt under full load, compared with the 15 to 20 watts a typical PC draws.
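Those figures are easy to sanity-check. A quick back-of-the-envelope calculation, assuming a hypothetical eight-hour workday over 250 working days, shows the scale of the difference:

```python
# Back-of-the-envelope check of the power figures quoted above
# (1 W for the NPU vs. 15-20 W for a typical PC under load).
npu_watts = 1
typical_watts_low, typical_watts_high = 15, 20
hours_per_day, work_days = 8, 250  # hypothetical office usage

npu_kwh = npu_watts * hours_per_day * work_days / 1000
typical_kwh_low = typical_watts_low * hours_per_day * work_days / 1000
typical_kwh_high = typical_watts_high * hours_per_day * work_days / 1000

print(f"NPU: {npu_kwh:.0f} kWh/yr vs typical: "
      f"{typical_kwh_low:.0f}-{typical_kwh_high:.0f} kWh/yr")
# NPU: 2 kWh/yr vs typical: 30-40 kWh/yr
```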
“When you do local AI workloads compared with cloud infrastructures, you are already substituting big cores used in the cloud [with] very small, local cores,” said Sanmartin. “Also, the new AI PC architecture has a blend of large and small cores, with workloads going to smaller cores [where possible] to optimize energy consumption.”
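The OS scheduler normally makes that big-versus-small-core decision automatically, but the idea can be sketched explicitly. The snippet below, assuming the psutil library on Linux or Windows and a hypothetical core layout in which OS indices 8-11 map to the efficiency cores, pins a background workload to the small cores:

```python
# Sketch: steering a background task onto efficiency cores with psutil.
# Assumption: on this hypothetical machine, CPU indices 8-11 map to the
# small/efficiency cores; the real mapping varies by chip and OS.
import psutil

proc = psutil.Process()               # the current process
efficiency_cores = [8, 9, 10, 11]
proc.cpu_affinity(efficiency_cores)   # restrict scheduling to small cores
# ... run the low-priority AI workload here ...
```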
Charlie Walker, senior director and general manager of Precision workstations and rugged PCs at Round Rock, Texas-based Dell, said a big problem with AI from a sustainability perspective is the need to push terabytes or petabytes of data to the cloud, which consumes server resources in transit and then requires still more resources to store that data.
An alternative, Walker told CRN, is to use a workstation to experiment with and fine-tune an AI model before deploying it to the cloud or, better yet, to an AI PC.
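One way to picture that workflow is a small parameter-efficient fine-tune that runs entirely on a workstation. The sketch below assumes the Hugging Face transformers, peft and datasets libraries, a stand-in base model (distilgpt2) and a hypothetical local text file; it illustrates the approach, not Dell’s tooling:

```python
# Sketch: fine-tuning a small model locally before any cloud deployment.
# Assumptions: Hugging Face stack installed; "local_corpus.txt" is a
# hypothetical local file of training text.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# LoRA keeps the trainable-parameter count small enough for a workstation.
model = get_peft_model(model, LoraConfig(r=8, task_type="CAUSAL_LM"))

dataset = load_dataset("text", data_files="local_corpus.txt")["train"]
dataset = dataset.map(
    lambda x: tokenizer(x["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()                          # the data never leaves the machine
model.save_pretrained("out/lora-adapter")
```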
Workstations and AI PCs each have their own sustainability advantages, Walker said. By starting the AI model on a workstation, the data stays in-house and available for local training before being pushed to a data center.
“If you want to go push it to your own data center, you really want to make sure that whatever you push for that training is what you want so you get back the expected results,” he said. “Meaning, ‘Do I have the right data? Do I have the right amount of data? Is the data in the right format? Do I understand the key variables and parameters for any given model?’ If someone hasn’t done that testing on a smaller device, it will require more iterations. So I can do that locally, with a lower-power device.”
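Walker’s questions translate naturally into automated pre-flight checks run on the local machine before anything is pushed upstream. A minimal sketch, assuming pandas and with hypothetical column names and thresholds:

```python
# Sketch of the pre-flight checks Walker describes, run locally before
# pushing data to the data center. Names and thresholds are hypothetical.
import pandas as pd

def validate_training_data(path: str, required_cols: list[str],
                           min_rows: int = 10_000) -> list[str]:
    """Return a list of problems; an empty list means the set looks pushable."""
    problems = []
    df = pd.read_csv(path)
    missing = [c for c in required_cols if c not in df.columns]
    if missing:
        problems.append(f"missing columns: {missing}")           # right data?
    if len(df) < min_rows:
        problems.append(f"only {len(df)} rows (< {min_rows})")   # right amount?
    null_share = df.isna().mean().max()
    if null_share > 0.05:
        problems.append(f"{null_share:.0%} nulls in worst column")  # right format?
    return problems

issues = validate_training_data("train.csv", ["prompt", "label"])
print(issues or "ready to push for training")
```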
An AI PC with an NPU is by far the most efficient way to run an AI model because that is precisely what the NPU is designed for, Walker said.
“The GPUs used in more traditional tower workstations or in a data center are designed for general-purpose compute around graphics. You’re moving data around inside the GPU. You’re trying to figure out how to allocate resources. That all takes more power versus pushing that model to an AI PC where all of that ends up being hard-coded. All that real-time processing has been done beforehand.”
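The “done beforehand” step Walker describes corresponds to ahead-of-time model preparation, such as quantizing a model so the NPU can execute a fixed, pre-compiled graph. A minimal sketch using ONNX Runtime’s dynamic quantization, with hypothetical file names:

```python
# Sketch of ahead-of-time model preparation: quantizing a model once so
# the NPU later runs a fixed graph with no real-time resource juggling.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="model_fp32.onnx",
    model_output="model_int8.onnx",   # smaller weights, NPU-friendly ops
    weight_type=QuantType.QInt8,
)
```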
Another use case for AI PCs is businesses using GenAI to augment the code-writing process and make it more efficient, Walker said.
“You can do all that on an NPU on the local AI PC,” he said. “People can leverage AI to supplement, augment and speed up their code development. AI PCs free up that data center capacity. And now it’s much more efficient for employees because it’s all local. There’s no network lag.”
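A local code-assist loop like the one Walker describes can be as simple as prompting a model served on the same machine. A sketch, assuming a local runtime that exposes Ollama’s /api/generate endpoint on its default port; the model name is hypothetical:

```python
# Sketch: a prompt goes to a model served on the same machine, so the
# request never crosses the network. Assumes a local server exposing
# Ollama's /api/generate endpoint on its default port.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "codellama",   # hypothetical locally installed model
        "prompt": "Write a Python function that parses ISO-8601 dates.",
        "stream": False,
    },
)
print(resp.json()["response"])  # generated code, produced entirely on-device
```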