Here's a story about how the AI gold rush is reshaping the memory chip business, and who gets to sell the shovels. The latest plot twist: Nvidia Corp. (NVDA) has reportedly picked its suppliers for the memory that will go into its next flagship AI chip, called Vera Rubin. And the winners are both from South Korea.
According to industry officials, Samsung Electronics Co. Ltd. (SSNLF) and SK Hynix have secured roles as exclusive suppliers of the sixth-generation high-bandwidth memory, known as HBM4. This gives the two Korean giants a major leg up in the race to supply memory for the next AI cycle, and it puts them ahead of Micron Technology Inc. (MU), which has been left out of this particular deal.
Think of it this way: Nvidia's AI chips are incredibly powerful, but they need to be fed data incredibly fast to do their job. That's where HBM comes in—it's the super-fast, specialized memory stacked right next to the processor. Getting chosen to supply it for Nvidia's next big thing is like winning a golden ticket in the semiconductor world.
The Price of Being Premium
And with great demand comes great pricing power. The AI boom isn't just about who gets the contract; it's about what they can charge for it. Samsung has reportedly started mass production of its HBM4 chips and is in talks to sell them for about $700 per unit, roughly 20% to 30% more than the previous generation of HBM fetched.
It's a sign of just how tight supply is in the market for premium AI memory. Samsung is also trying to play catch-up after initially falling behind SK Hynix in the HBM race. Now, both companies are competing to be the main supplier for Nvidia's Vera Rubin platform, which is expected to drive the next wave of AI hardware.