Here's a thought: maybe the AI trade has been priced all wrong. Everyone's talking about silicon chips, but the real story might be about materials—specifically, a metal called gallium that most people have never heard of.
"AI uses more minerals than most people realize," Harvey Kaye, Executive Chairman of U.S. Critical Materials, told MarketDash. He points to materials like gallium, which have "very few substitutes" if supply gets tight.
Gallium is one of those things that flies under the radar unless you're in a very specific circle. It sits at the heart of gallium nitride (GaN), a technology that's becoming crucial for high-efficiency power systems in next-generation data centers. And that's where Nvidia Corp's (NVDA) world starts bumping into a potential bottleneck.
Gallium's Quiet Entry Into The AI Stack
Nvidia is working with Navitas Semiconductor on next-generation 800V HVDC infrastructure, which brings GaN into sharper focus. These systems are designed to make power delivery in AI data centers more efficient—and that's becoming just as important as the compute power itself.
Think about it: AI isn't just about chips anymore. It's about moving and managing enormous amounts of power. GaN, built on gallium, is turning into a key player in that shift.
Here's Where The Risk Sharpens
Now for the tricky part: gallium supply—and more importantly, processing—is heavily concentrated. "China controls much of the global processing capacity," Kaye notes, adding that "even small disruptions push prices up quickly."
This isn't just a theoretical concern. Beijing started restricting gallium exports back in 2023 and later tightened those controls, showing it's willing to use critical minerals as leverage.
For Nvidia, the exposure isn't at the chip level—its GPUs are still silicon-based and manufactured by Taiwan Semiconductor Manufacturing Co. Ltd. (TSM)—but deeper down, in the infrastructure layer that powers AI's expansion. If gallium supply tightens, the ripple effects could show up in power systems, deployment costs, and ultimately, how fast data centers can scale.
The result is a risk that doesn't sit on the surface of the AI narrative but runs underneath it. And in a market obsessed with compute supremacy, it's the inputs nobody's watching that can tighten the fastest.