
High-Capacity HBF Memory Finds a New Home with Google, Not Nvidia

[Image: Nvidia. From gaming to AI, Nvidia drives visual computing innovation. Credit: TechGolly]

A new type of memory called High-Bandwidth Flash (HBF), which offers far more capacity than today's HBM, reportedly won't be used by Nvidia. Instead, Google plans to be a key buyer of the technology. NAND Flash has become very important with the recent boom in AI. While mostly used for storage in devices like SSDs, this upcoming flash-based memory could play a much bigger role. HBF is a next-generation application of NAND Flash, bridging the gap between HBM and regular NAND storage.

SanDisk and SK Hynix are working together to develop HBF. It will be built much like HBM, by stacking many layers of NAND Flash on top of each other. Each layer will connect through tiny vertical pathways called through-silicon vias (TSVs), joining all the NAND dies into a single stack. While HBM currently offers roughly 32-64 GB per stack, HBF could reach capacities of up to 4 TB.
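The capacity gap comes down to simple arithmetic: a stack's capacity is the number of layers times the capacity of each die, and NAND dies are far denser than DRAM dies. The sketch below illustrates this with assumed layer counts and per-die capacities; none of these figures are published HBF specifications.

```python
# Back-of-the-envelope stack capacity arithmetic. All figures below are
# illustrative assumptions, not published HBM/HBF specifications.

def stack_capacity_gb(layers: int, gb_per_die: int) -> int:
    """Capacity of a TSV-connected die stack: layers x per-die capacity."""
    return layers * gb_per_die

# HBM-style stack: DRAM dies are small, e.g. 16 layers of 4 GB each.
hbm = stack_capacity_gb(layers=16, gb_per_die=4)      # 64 GB

# HBF-style stack: NAND dies are far denser, e.g. 16 layers of 256 GB each.
hbf = stack_capacity_gb(layers=16, gb_per_die=256)    # 4096 GB = 4 TB

print(f"HBM-like stack: {hbm} GB")
print(f"HBF-like stack: {hbf} GB ({hbf // 1024} TB)")
```

With the same layer count, the denser NAND dies put the stack in the terabyte range, matching the 4 TB figure cited above.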


HBM is faster, but HBF, with careful design, should provide enough bandwidth for important AI tasks. HBF is especially well suited to inference workloads, which matter more than ever with the rise of agentic AI. Its larger capacity also eases memory limitations next to the main compute chip.
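Why capacity matters for inference can be seen with a quick fit check: do a model's weights fit in on-package memory at all? The sketch below uses assumed parameter counts and per-package capacities for illustration only.

```python
# Rough inference fit check: do a model's weights fit in on-package memory?
# Parameter counts and package capacities are illustrative assumptions,
# not vendor figures.

def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Memory footprint of model weights (FP16/BF16 = 2 bytes per parameter)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

model = weights_gb(params_billion=400)   # a 400B-parameter model -> 800 GB

hbm_package = 8 * 36       # e.g. 8 HBM stacks x 36 GB  = 288 GB
hbf_package = 8 * 4096     # e.g. 8 HBF stacks x 4 TB   = 32768 GB

print(f"weights: {model:.0f} GB")
print(f"fits in HBM-only package: {model <= hbm_package}")
print(f"fits in HBF package:      {model <= hbf_package}")
```

An HBM-only package forces the weights to be split across many accelerators or spilled to slower storage, while an HBF-class capacity holds them in one place, which is the trade-off the article describes.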

Even though HBF sounds promising, industry insiders say Nvidia has no plans to adopt the new memory anytime soon. Nvidia believes that enterprise SSDs (eSSDs) can handle its capacity and speed needs, and it is reportedly collaborating with Kioxia to create PCIe Gen7 SSDs that are up to 100 times faster than current designs.


Right now, SK Hynix is leading HBF development, with the first samples expected later this year. Google is reportedly set to be a major customer, planning to use the technology for its fast-growing AI efforts. Google's TPU (Tensor Processing Unit) fleet is expanding quickly, and the company is boosting its computing power with several new TPU designs in the works.

It remains to be seen whether HBF becomes widely used, but it offers a big opportunity beyond replacing HBM: it could also replace standard DDR memory. Servers have started using more LPDDR memory as CPUs become a limiting factor in AI, creating large demand for LPDDR5 and LPDDR5X. With HBF's stacked design, chip makers and AI companies could save space on circuit boards, add more capacity, keep power use low, and maintain high speeds.
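The board-space argument can be made concrete with a package count: how many memory packages does a target capacity require? The per-package capacities below are assumptions chosen only to illustrate the comparison.

```python
# Illustrative board-space comparison: packages needed to reach a target
# capacity. Per-package capacities are assumptions, not product specs.

def packages_needed(target_gb: int, gb_per_package: int) -> int:
    """Ceiling division: packages required to reach the target capacity."""
    return -(-target_gb // gb_per_package)

target = 2048  # suppose the board needs 2 TB of memory

lpddr = packages_needed(target, gb_per_package=64)    # e.g. 64 GB LPDDR5X parts
hbf = packages_needed(target, gb_per_package=4096)    # e.g. one 4 TB HBF stack

print(f"LPDDR5X packages needed: {lpddr}")
print(f"HBF stacks needed:       {hbf}")
```

Under these assumptions, dozens of discrete LPDDR packages collapse into a single stacked part, which is the space saving the paragraph above points to.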
