AI is embedded in Intel’s chipmaking process from ‘front end silicon design, back end, software development, all the way to manufacturing’

While AI often hits the headlines for all the wrong reasons, Intel appears to be doubling down on the tech in multiple aspects of its chip business.

AI has its uses. While it's tempting to think of AI in terms of chatbots, image generators and perhaps the thing that might end us all if we're not careful, major tech companies have been busy implementing AI and machine learning optimisation in multiple aspects of their business, in a quest to integrate what used to be a fringe concept into the processes that create the products you may already own.

I recently had the chance to chat to Intel India President and VP & Head of Client Computing Group Sustainability, Gokul Subramaniam, in a wide-ranging discussion about Intel's sustainability goals, product life cycles and more.

I took the opportunity to ask how Intel uses AI and machine learning models with regard to efficiency within its products, and whether AI was used to optimise the process.

“We have AI in engineering as a big focus area, starting from the front end of our silicon design at the RTL level, to the back end, which is basically post powering on the silicon, leading it all the way to production readiness. And then in our software development and debug we use AI, as well as in our manufacturing and test.

“We also use AI when it comes to a lot of the telemetry data that we collect to figure out what decisions we can take from a usage standpoint and things like that. So it’s a big focus area across the lifecycle, front end silicon design, back end, software development, all the way to manufacturing.”

So it appears that, according to Intel, AI is already involved in a huge number of aspects of chip design and production. However, given that our discussion revolved around Intel’s goals towards sustainability, I also took the chance to ask him about his views on the scalability of AI, and the sustainability of the increasing power demands that come with it.


“One of the things that Intel believes in is the AI continuum, from cloud data centre, to network edge and the PC. Now, what that means is, it’s an AI continuum, it’s not one size fits all, you cannot just have a single or one monolithic view of computing. There is a variety of needs in the AI continuum, the large and very large models, to the small and nimble models that can potentially be in the devices on the edge.

“So there’s everything from high performance computing, like the Argonne labs that are tens of thousands of servers, to on-prem, smaller companies that may need only a few Xeons, or maybe Xeons and a few GPUs. And so we have this heterogeneous compute that allows people to be able to do that, to fit their power envelope and compute gains.”

While these are noble goals, it's been difficult to ignore the recent headlines raising concerns about the sustainability of an AI future, one that often demands huge amounts of processing power, and potentially astonishing amounts of electrical power to match.

“AI is new, I don’t think anybody’s cracked the code,” Subramaniam says. “What that means to sustainability, it’s going to be a journey that we’re going to take and learn…as long as we’re very power efficiency focused and allow our customers to build with the technologies that we give them beyond the silicon, that’s kind of where it’s going to work.”

While installations like the Argonne National Laboratory's Aurora exascale supercomputer, built on Intel's Xeon CPU Max and Data Center GPU Max series, are likely to draw huge amounts of power themselves, the assertion that Intel sees AI workloads as spread across multiple different types of computing for different purposes makes for an interesting distinction.


With the rise of consumer chips featuring NPUs (neural processing units) designed specifically for AI processing, first seen in Intel's Meteor Lake mobile CPUs and expected to also appear in the yet-to-be-announced Arrow Lake desktop CPUs, it seems the focus is not just on huge amounts of data centre AI compute capability, but on AI processing spread across future chips as a whole.


Given the rise of the “AI PC”, it certainly seems the chip industry is leaning heavily into the AI future, although there's an argument to be made that any PC with a modern GPU is already equipped for AI processing. The current definition of an AI PC, however, seems to range from any PC delivering over 45 TOPS from a dedicated NPU, to simply having the appropriate sticker on your keyboard.

Regardless, it's turtles all the way down for Intel, it seems, or in this case, AI from top to bottom. It seems likely that your next CPU, or perhaps even the one you're using right now, may well have felt the touch of artificial intelligence already, at least if it's an Intel unit.
