Micron is one of the big players in the memory business, making both NAND and DRAM, the building blocks of SSDs and RAM respectively. So, when Dinesh Bahal, VP and GM of Micron's Consumer & Components Group, tells you what the situation is in the market right now, you should probably listen closely.
"All three of us, the big three, so Hynix, us, Samsung, are spending a lot of energy and effort towards building HBM products," Bahal says.
HBM, or High Bandwidth Memory, is used alongside data centre GPUs. As the name suggests, its specialty is massive bandwidth, which is especially important for compute workloads. You might remember AMD once used HBM in a gaming graphics card, Vega, though it was ultimately deemed too expensive and not beneficial enough compared to GDDR5/6.
So, what's HBM got to do with PC builders if it's not even used in gaming graphics cards?
“The reason that’s all important is there’s a lot of investment going in here. And that investment is really going to impact the supply demand balance, which may not impact your readers in the short term in terms of pricing issues, et cetera. But that is really what we believe is going to continue to happen over the course of the next few years.”
I’m told this has already been happening over the last nine or 10 months, which has coincided with an increase in prices for storage after being so low for so long.
“It will have an impact on the client side. From a demand perspective, or from a supply perspective.”
Essentially, the focus has shifted so massively towards HBM that it will have an impact on Micron's other products, including DRAM and NAND, and potentially those of other memory suppliers too.
“A bunch of consumers are in a sticker shock kind of environment of ‘hey, RAM prices have always gone down. NAND prices have always gone down. I could buy a terabyte for 50 bucks. That terabyte is now 80 bucks, what’s going on?'”
You guessed it: AI.
“Anybody who’s talking AI, anybody who’s doing AI, memory becomes at the core of it, as opposed to sitting at the edge of it. But now it’s like, without memory, it ain’t gonna happen.”
The high demand for memory to power AI, according to Micron, is going to impact the wider supply in terms of where the manufacturing focus lies and how capacity is spent. It might also increase demand for memory on the client side, as AI PCs have higher memory requirements than most.
For example, Microsoft now won't certify an 8 GB laptop for Copilot+. Similarly, Micron says its LPCAMM2 memory modules are poised for AI on the desktop, and so far only come in 32 GB and 64 GB capacities.
“There are three things that AI is hungry for… bandwidth, capacity and power,” Praveen Vaidyanathan, from Micron’s Computer Networking business unit, tells me.
While we've yet to see a must-have feature for AI on any PC, manufacturers are massively keen to whack a sticker on their products with those two letters. So, even if you're not interested, it might not matter what you think.
“Memory’s become sexy again,” Bahal says.
Though Micron would be keen to talk up the importance of memory, demand is surely set to skyrocket with AI. Most PC and component manufacturers are looking for any way to fight the slump and reignite record sales after a couple of bad years. AI is widely regarded as the best way to do just that, and it does gobble up lots and lots of memory.