Cerebras CEO Andrew Feldman On Taking On Nvidia, Building Sovereign AI, And The Road To An IPO

MarketDash
The CEO of the AI chip challenger valued at $23 billion talks about the shift from training to inference, why export controls worry him, and how close we are to AGI.

Here's a sign the AI hardware game is changing: Nvidia (NVDA) spending $20 billion to buy an inference specialist like Groq. To Andrew Feldman, co-founder and CEO of Cerebras Systems, that's not just a big deal. It's a strategic confession. The era when one type of chip—the GPU—could rule every part of the AI kingdom is over.

"We saw AI on the horizon… And we knew that the graphics processing unit would probably not be the right machine," Feldman said, recalling the company's founding insight in 2015. The idea was simple, if audacious: start from zero. "If we started with a clean sheet of paper and designed a solution optimized for AI, not for graphics, not for databases, not for web serving, but just for AI, we could build a better solution."

That bet is looking pretty good right now. Cerebras, with its revolutionary wafer-scale engine, just closed a Series H funding round at a $23 billion valuation. That's nearly triple its valuation from just months before. The round was led by Tiger Global and included a who's who of investors like Benchmark, Fidelity, and even AMD (AMD). This comes on the heels of the company confidentially filing for a U.S. IPO, targeting a public listing as early as April 2026.

Oh, and they also reportedly landed a deal with OpenAI worth over $100 billion to supply 750 megawatts of computing power through 2028. When the biggest name in AI starts writing very large checks to someone other than Nvidia, you know the competitive landscape is shifting.

I spoke with Feldman after this landmark round. We talked about everything from his boredom-driven path to founding the company to the surprising geography of its growth and the pragmatic steps toward going public.

From Boredom to Building an AI Engine

Feldman's story doesn't start with a eureka moment in a lab. It starts with being bored. After selling his previous data center company, SeaMicro, to AMD, he found himself at a loose end. "I was bored," he stated simply. The spark came from colleagues with a prescient view of the future. They raised money with stunning ease in 2016. "We made eight presentations, we got eight term sheets, and so we started to go."

The Partnership That Proved the Point

While Cerebras now counts hundreds of customers, including recent deals with Meta, IBM, and Mistral, Feldman points to a partnership with Abu Dhabi's G42 as the catalyst that proved their system's transformative power. In fact, a staggering 87% of Cerebras's revenue for the first half of 2024 came from G42.

It started in 2023 with a demo. "We helped them solve a technical problem that had been taking months on GPUs, and we solved it in a few days. And this got them excited," Feldman said.

The relationship exploded. "We've built hundreds of exaflops of computers for them since then," he noted. This collaboration has created what Feldman calls "sovereign AI" outcomes for the region, including training the leading Arabic model, Jais, and serving advanced models like K2 Think. "And in return, they have consumed almost everything we could make for 2023 and 2024."

The Geopolitical Calculus: Keeping Allies in the U.S. Orbit

This deep partnership puts Feldman at the center of a heated debate about U.S. export controls on advanced technology. His argument is pragmatic, not political.

"I think that we should be working to support our allies. And the United Arab Emirates is clearly an ally of the US," he said. "We want them working in our ecosystem. We don't want them working in China's ecosystem. I think we should empower them. We should allow them to build sovereign AI. And I think that's true not just for here, but for Poland and for France, and for Mexico, and our allies around the world."

His message to policymakers in Washington is clear: being overzealous risks pushing allies into other tech spheres. "It's in our interest if they're using American-made infrastructure and using American-made models and using American infrastructure to make their own models."

The New AI Playbook: Inference, Efficiency, and Real-World Impact

Feldman sees the industry at an inflection point. The race is no longer just about who can train the biggest model. "Right now, inference-time compute… improving the quality of the answer based on additional use of tokens through reasoning is an extremely powerful tool," he said, emphasizing that this is where immense value is now being created.

This shift comes with an urgent need for efficiency. "AI uses a lot of power. And that means we have an obligation to produce amazing results, to solve important societal problems." He positions Cerebras's specialized architecture as the answer. "I think our chips use a fraction of the power of GPUs."

For him, the proof is in the real-world applications. "Working with GlaxoSmithKline, we're designing new drugs with AI. With Mayo Clinic, we are doing personalized medicine. So based on your genomic information, we are predicting which drugs will be most effective for you. We're working with startups to build agents. We're working with mid-sized companies to write coding IDEs. It's an enormous spectrum and it's really fun to see all the different places AI can reach."

Scaling Up and Staring Down AGI

In a market hungry for Nvidia alternatives, Feldman is confident but realistic. "This is a huge market, there's lots of opportunity… I think there'll be many winners."

He's direct about the ease of switching for certain workloads. "To move from using NVIDIA GPUs, for inference, to Cerebras in our cloud will take about 10 keystrokes and should take you less than a minute. Training is a harder problem, but there too, within hours, you should be able to move a training workflow from a GPU cluster to Cerebras."
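The "10 keystrokes" claim rests on API compatibility: when two inference providers speak the same wire protocol, migrating a workload reduces to swapping a base URL (and, at most, a model name). A minimal sketch of that idea, where the GPU-cloud endpoint is a made-up placeholder and the Cerebras endpoint reflects its OpenAI-compatible cloud API:

```python
# Sketch: with an OpenAI-compatible chat-completions client, switching
# inference providers is roughly a one- or two-field config change.
# The "nvidia-gpu-cloud" endpoint below is a hypothetical placeholder.

def make_config(provider: str) -> dict:
    """Return a minimal chat-completions client config for a provider."""
    configs = {
        "nvidia-gpu-cloud": {
            "base_url": "https://api.example-gpu-cloud.com/v1",  # placeholder
            "model": "llama-3.3-70b",
        },
        "cerebras": {
            "base_url": "https://api.cerebras.ai/v1",  # Cerebras's OpenAI-compatible endpoint
            "model": "llama-3.3-70b",
        },
    }
    return configs[provider]

before = make_config("nvidia-gpu-cloud")
after = make_config("cerebras")

# Only base_url differs here: the request body, auth header shape, and
# response parsing stay the same under an OpenAI-compatible protocol.
changed = sorted(k for k in before if before[k] != after[k])
print(changed)  # → ['base_url']
```

This is an illustration of why inference migration is cheap, not a statement of Cerebras's actual onboarding flow; training migration, as Feldman notes, involves more than a config change.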

The new billion-dollar funding round has a clear purpose: "We're gonna dramatically increase manufacturing capacity, we're gonna open more data centers around the world, and we're going to continue to pursue extraordinary engineering ideas. Ideas that add 10 or 100x to our performance."

As for the ultimate goal of Artificial General Intelligence? Feldman is cautious with timelines. "I think we will in the next several years for many math problems exceed most humans' capacity at math. Is that AGI? I don't think so… But I think we're still eight or ten years away from sort of a general AGI that as a whole is superior."

The CEO's AI Stack and the IPO Path

Feldman is a power user of the technology he helps build. "I use GPT-5.0, I use Anthropic, I use Qwen3 Coder 480B, I use OSS 120B and then I use every day some proprietary models that we built at Cerebras," he said, detailing how AI augments his day from writing emails to debugging code.

On the inevitable question of an IPO, he confirmed the process is active. "We withdrew our S1 because it was out of date and because we now had a new cap table with new investors. We will update it as quickly as we can and go forward towards an IPO." The company has since re-filed confidentially for its U.S. public offering.

As our conversation wrapped up, Feldman summed up the breakneck pace of the moment: "Right now, you want to spend every minute you can in AI because so much exciting is happening… It's really an extraordinary time."

In Andrew Feldman's view, the future of AI hardware is being written through bold partnerships, a commitment to sovereign capability, and specialized efficiency—not just bigger versions of yesterday's chips. With a $23 billion valuation, a clear path to the public markets, and a seat at the table with the industry's giants, Cerebras isn't just challenging Nvidia; it's trying to build a whole new center of gravity.
