So, you thought the Nvidia story was just about selling a ton of AI chips? Think bigger. That's the message from Bank of America analyst Vivek Arya, who took a fresh look at the company following its GTC 2026 conference and came away more bullish than ever. He's sticking with a Buy rating and a $300 price target, arguing that the chipmaker's role is expanding far beyond its core hardware.
The headline number is eye-popping: Nvidia is looking at over $1 trillion in data center revenue from 2025 through 2027. But here's the kicker—Arya says that massive figure might only be about two-thirds of the real opportunity. It excludes things like Nvidia's own CPUs, its storage systems, and newer rack-scale offerings. Add those in, and the total addressable market could be roughly 50% larger. In other words, as AI adoption scales, Nvidia is positioning itself to capture a much bigger pie than most people are counting on.
It's Not Just Volume, It's Profitability
It's one thing to sell a lot of stuff; it's another to make good money doing it. Arya highlights that Nvidia is getting more efficient, driving down the cost of generating AI outputs to about $6 per 1 million tokens. More interestingly, the company is pushing into a new category of ultra-low-latency workloads with its LPX systems. This isn't just another product line—Arya estimates it could represent about 25% of the market and, crucially, carry "significantly higher profitability" than other segments. So, the growth story isn't just about shipping more units; it's about moving into more lucrative parts of the AI stack.
The Roadmap and the Customers Are Evolving
What's next on the tech side? Arya points to Nvidia's evolving roadmap, which includes a near-term shift toward combining copper and optical technologies in its systems, with a future move to fully advanced optical solutions. But the customer base is changing too. The story is no longer just about selling to the giant cloud providers like Amazon and Microsoft. Enterprise and sovereign customers—think big corporations and national governments building their own AI infrastructure—are expected to account for a growing share of AI workloads over time. That diversification supports what Arya sees as sustained long-term growth.