What exactly makes an AI PC fit for the enterprise? • The Register
Sponsored Feature Artificial intelligence is ubiquitous in the enterprise technology world. Whether AI PCs are ubiquitous is an open question.
The fact is most enterprise personal devices are running applications or services that contain at least elements of AI. But that doesn’t mean they are truly tailored or tuned for supporting AI, not least when those services run on the device itself.
A given client device might have an abundance of processing horsepower that can theoretically run a small language model or crunch large amounts of data. But if that means its battery drains in an hour, or a quick security scan brings it to a crawl, the user is no better off, particularly when it comes to enterprise workloads.
So how do we identify a truly capable AI PC, and what can we expect to find under the hood?
Microsoft’s Copilot function is arguably the most widely deployed gen AI technology in the enterprise. So, its Copilot+ PC definition is a good starting point for working out just what an AI PC needs to offer.
On top of the minimum requirements for a Windows 11 PC, including a 1GHz CPU and Trusted Platform Module v2.0, the Copilot+ standard calls for at least 16GB of DRAM, 256GB of SSD storage, and a Neural Processing Unit (NPU) capable of more than 40 trillion operations per second (TOPS).
There’s no doubt that’s a powerful baseline, but in a rapidly evolving market, is that enough for enterprise users to gain the full benefits of AI from the comfort of their own laptop?
Intel partner technical sales specialist Jimmy Wai says when it comes to supporting enterprise AI, it’s not enough to simply focus on the NPU. “It’s about the CPU, the GPU and NPU, because each of those components is good at certain AI tasks,” he says.
So, when it comes to delivering the optimal enterprise AI experience, “It really depends on what the workload is, and the developers can choose what is the best xPU to run their workload.”
While Copilot or Gen AI tasks might impose a heavy load on the NPU, he says, other key workloads will call on other components.
Graphics-heavy services such as Zoom and other collaboration platforms will impose a heavy burden on the graphics components, or the CPU, but they will also draw on the NPU for accompanying AI assistant features. Likewise, in-memory security scans can be taxing for the GPU or CPU. And while GPUs are critical for building or training language models, real-time inference work is often best handled by the CPU.
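That workload-to-component mapping can be sketched as a simple routing table. The sketch below is purely illustrative — the workload names and the `pick_xpu` helper are hypothetical, not an Intel or Copilot+ API — but it captures the idea that each processing unit suits different AI tasks:

```python
# Hypothetical mapping of enterprise AI workloads to the processing unit
# best suited to them, per the article's description (not an actual API).
WORKLOAD_ROUTES = {
    "genai_assistant": "NPU",     # Copilot-style assistant tasks
    "video_collab": "GPU",        # graphics-heavy collaboration platforms
    "security_scan": "GPU",       # parallel in-memory scanning offload
    "realtime_inference": "CPU",  # latency-sensitive inference work
    "model_training": "GPU",      # throughput-bound training workloads
}

def pick_xpu(workload: str) -> str:
    """Return the preferred processing unit for a workload, defaulting to CPU."""
    return WORKLOAD_ROUTES.get(workload, "CPU")

print(pick_xpu("genai_assistant"))  # NPU
print(pick_xpu("unknown_task"))     # CPU
```

In practice, frameworks such as Intel's OpenVINO expose this kind of choice to developers as a device-selection parameter, letting them target the CPU, GPU, or NPU per workload.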
Underlying all of these is the energy question. AI workloads are often intense. At the same time, the ability to do a full day's work on battery power is a key aim for enterprise users. That no longer means just supporting standard desktop packages for word processing or spreadsheets: a full day's work can now involve multiple video sessions, heavy use of Copilot or other gen AI tools, and intense data processing.
You’re the TOPS
Once this is grasped it becomes clear that NPU performance alone will not necessarily satisfy the diverse demands that enterprise AI will impose on client hardware in the real world.
It's that combination of demands and workloads that has informed the latest iteration of Intel's second-gen Core Ultra processors, Wai explains, which go above and beyond the 40 TOPS baseline set by Copilot+.
“We focus on being a really energy efficient mobile platform that can give customers really great performance,” he says. “But also at a very low energy power envelope.”
In addition to up to ten x86 cores – four performance cores, four efficiency cores, and two low-power efficiency cores – the platform's tile architecture carries up to eight Intel Arc Xe graphics cores and an Intel AI Boost NPU.
The NPU itself delivers 48 TOPS, comfortably clearing the Copilot+ baseline for AI assistant-focused tasks. But the combined platform delivers 120 TOPS: 5 from the CPU and 67 from the graphics package, on top of the NPU's 48.
That aggregate processing power means it can support key AI workloads under the Copilot+ definition without compromising performance when it comes to broader enterprise tasks.
For example, the CPU's 5 TOPS means it can comfortably handle real-time inferencing work – a role that is often more efficiently handled by the CPU than by other components. Likewise, the GPU's 67 TOPS, as well as supporting video tasks, makes it much more efficient at security scanning, dramatically easing the load on the CPU.
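The 120 TOPS platform figure is simply the sum of the per-unit numbers quoted above, as a quick sanity check shows:

```python
# Per-unit AI throughput quoted for the second-gen Core Ultra platform,
# in trillions of operations per second (TOPS).
tops = {"NPU": 48, "CPU": 5, "GPU": 67}

total = sum(tops.values())
print(total)                       # 120 — the aggregate platform figure
print(tops["NPU"] > 40)            # True — the NPU alone clears the 40 TOPS Copilot+ baseline
```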
At the same time, the overall package has been engineered for energy efficiency. It is built on TSMC’s 3nm node, which like any process shrink improves energy efficiency. And the x86 cores offer much more granular clock speed management, rather than simply dialing up and down in 100MHz increments.
“With a more granular clock frequency, you can really fine tune that to the workload,” says Wai.
Another big departure is that DRAM is directly integrated onto the package, with a choice of 16GB or 32GB, which further boosts energy efficiency, and which also enhances security.
As an enterprise platform, it supports Intel’s vPro technology, which offers Intel Active Management Technology. This allows the system to be managed and supported remotely, even when the operating system is inaccessible. This becomes critical in the event of a breach or major outage, allowing support teams to reach affected client devices and help them recover, Wai explains. “Blue Friday last year is a really good example.”
Enter software
But AI is about more than just hardware alone. Wai points out that Intel supports common AI development frameworks and works closely with ISVs to tailor systems and ensure both overall compatibility, and that their products can make optimal use of the architecture. So, they are not required to lock themselves into a software walled garden to make full use of the platform.
Running workloads locally not only ensures energy efficiency on the client device; by reducing the need for data to be uploaded to the cloud for processing, it also boosts security and reduces the energy and resource burden on software and service providers.
Likewise, says Wai, Intel's platform is, naturally, natively x86. Some newer PC platforms require application translation, which soaks up resources and energy. Wai points out that these compatibility issues are, if anything, a bigger problem when it comes to drivers, which require direct access to the hardware. This can be a shock for enterprises that, over the last few decades, have become used to peripherals and other components just working.
The result of all this, says Wai, is that enterprise users will have a far more responsive experience on second-gen Core Ultra-based systems.
But, more importantly, users gain much improved battery life. This includes enterprise builds, which typically introduce many more background processes, whether for DRM, security, device management, or policy enforcement.
“We have heard a lot of feedback from customers that they finally can make the laptop last a full day,” says Wai. “Even with the enterprise build.”
This might seem academic with many companies still working out their broader AI strategies. But it’s imperative they understand exactly what it takes for their employees to benefit from the technology.
“So, what do you want to run in the future? Do you have a road map, or do you have some target software that you want to run?” Wai asks. “If you do, then you need to make sure what you buy is capable of running those workloads.”
But he continues, “If you don’t have a target right now and you want to be ready, then you better look for the AI PC platform that gives you the best flexibility and competitive compatibility for the future.” Because the future will be here before you know it.
For more on Intel AI PCs for business, click here.
Sponsored by Intel.
