
Credit: Andrew Feldman (co-founder & CEO, Cerebras) and Anna Tutova (founder, AI Crypto Minds).
The AI hardware industry is experiencing a foundational shift. The recent landmark $20 billion agreement between NVIDIA and inference-specialist Groq is more than a deal – it’s a strategic admission that the era of GPU dominance for all AI workloads is ending. In this new landscape, where high-speed, efficient model inference is becoming the critical bottleneck and a vast market opportunity, companies built with specialized architectures are poised to thrive.
One such company, Cerebras Systems, is moving decisively to capitalize on this inflection point. With its revolutionary wafer-scale engine, the company has solidified its position as a leading disruptor, recently closing a Series H round at a staggering $23 billion valuation, nearly triple its previous valuation from just months earlier. This round was led by Tiger Global, with participation from Benchmark, Fidelity Management & Research Company, Atreides Management, Alpha Wave Global, Altimeter, AMD, Coatue, and 1789 Capital among others. This renewed momentum follows its earlier $1.1 billion Series G round at an $8.1 billion valuation and comes as the company has confidentially filed for a US IPO, targeting a public listing as early as April 2026.
Adding to its momentum, Cerebras recently secured a landmark deal with OpenAI, reported to be worth over $100 billion, to provide 750 megawatts of computing power through 2028, a clear signal that major AI players are actively seeking alternatives to Nvidia’s hardware.
I sat down with Cerebras’ co-founder and CEO, Andrew Feldman, following this landmark round. In a wide-ranging conversation, he detailed the company’s path from a clean-sheet idea to a global hardware disruptor, the surprising epicenters of its growth, and the pragmatic steps toward an impending IPO.
From Boredom to Building the Engine of AI
Andrew Feldman’s path to founding Cerebras wasn’t linear. After selling his previous data center company, SeaMicro, to AMD, he found himself at a crossroads. “I was bored,” he stated simply. The spark came in 2015 from colleagues with a prescient insight: existing compute was fundamentally misaligned with the coming AI wave. “We saw AI on the horizon… And we knew that the graphics processing unit would probably not be the right machine,” he said. The answer was to start from zero. “If we started with a clean sheet of paper and designed a solution optimized for AI, not for graphics, not for databases, not for web serving, but just for AI, we could build a better solution.”
That conviction attracted immediate backing. “We went out to raise money in March 2016. We made eight presentations, we got eight term sheets, and so we started to go.”
Global Adoption and Partnerships
While Cerebras now counts hundreds of customers worldwide, including recent mega-deals with Meta, IBM, and Mistral in 2025, Feldman highlighted a catalytic partnership with G42 that proved the system’s transformative potential. G42 accounted for 87% of Cerebras’ revenue in the first half of 2024, and in 2023 the two companies launched a supercomputer called Condor Galaxy.
It began in 2023 with a simple demonstration. “We helped them solve a technical problem that had been taking months on GPUs, and we solved it in a few days. And this got them excited.”
The relationship exploded from there. “We’ve built hundreds of exaflops of computers for them since then,” Feldman said. This collaboration has produced tangible, sovereign AI outcomes for the region. “We’ve trained the leading Arabic model, Jais, with their partner, Mohamed bin Zayed University of Artificial Intelligence (MBZUAI). We currently serve K2 Think, which is sort of one of the leading thinking models… It trains thinking models, instruct models. It’s really been an extraordinary partnership. And in return, they have consumed almost everything we could make for 2023 and 2024.”
A Strategic Calculus: Keeping Allies in the US Tech Orbit
Feldman, operating at the intersection of cutting-edge tech and global alliances, is deeply concerned about restrictive US export controls. His argument is pragmatic, not political.
“I think that we should be working to support our allies. And the United Arab Emirates is clearly an ally of the US. We want them working in our ecosystem. We don’t want them working in China’s ecosystem. I think we should empower them. We should allow them to build sovereign AI. And I think that’s true not just for here, but for Poland and for France, and for Mexico, and our allies around the world.”
His message to Washington is clear: overzealous regulation risks ceding influence. “It’s in our interest if they’re using American-made infrastructure and using American-made models and using American infrastructure to make their own models.”
The New AI Playbook: Inference, Efficiency, and Impact
Feldman sees the industry at an inflection point, moving beyond pure training scale. “Right now, inference-time compute… improving the quality of the answer based on additional use of tokens through reasoning is an extremely powerful tool,” he noted, emphasizing that this is where much of the race’s value is now being created.
This comes with an urgent need for sustainable scaling. “AI uses a lot of power. And that means we have an obligation to produce amazing results, to solve important societal problems.” He positions Cerebras’ architecture as inherently more efficient. “I think our chips use a fraction of the power of GPUs.”
The tangible impacts, for him, are what matter most. “Working with GlaxoSmithKline, we’re designing new drugs with AI. With Mayo Clinic, we are doing personalized medicine. So based on your genomic information, we are predicting which drugs will be most effective for you. We’re working with startups to build agents. We’re working with mid-sized companies to write coding IDEs (integrated development environments). It’s an enormous spectrum and it’s really fun to see all the different places AI can reach.”
The Road Ahead: Scaling and the AGI Horizon
In a market seeking alternatives, Feldman is confident but clear-eyed. “This is a huge market, there’s lots of opportunity… I think there’ll be many winners.” When asked about luring customers from NVIDIA, Feldman is direct about the simplicity of switching for inference workloads. “To move from using NVIDIA GPUs, for inference, to Cerebras in our cloud will take about 10 keystrokes and should take you less than a minute. Training is a harder problem, but there too, within hours, you should be able to move a training workflow from a GPU cluster to Cerebras.”
The massive new funding round has a direct purpose: “We’re gonna dramatically increase manufacturing capacity, we’re gonna open more data centers around the world, and we’re going to continue to pursue extraordinary engineering ideas. Ideas that add 10 or 100x to our performance.”
As for the holy grail of AGI? Feldman demurs on a simple timeline: “I think we will in the next several years for many math problems exceed most humans’ capacity at math. Is that AGI? I don’t think so… But I think we’re still eight or ten years away from sort of a general AGI that as a whole is superior.”
The Personal AI Stack and the IPO Horizon
Feldman is a hands-on user of the technology he builds. “I use GPT-5.0, I use Anthropic, I use CoinCoder 480B, I use OSS 120B, and then I use every day some proprietary models that we built at Cerebras.” His daily tasks are augmented by AI, from email generation and deep research summaries to debugging code. “I’m trying to use it as much as I can every day.”
As for the inevitable question of an IPO, he confirmed the process is actively underway. “We withdrew our S1 because it was out of date and because we now had a new cap table with new investors. We will update it as quickly as we can and go forward towards an IPO.” Cerebras has since refiled for its IPO in the US.
As our conversation ended, Andrew Feldman summed up the moment: “Right now, you want to spend every minute you can in AI because so much exciting is happening… It’s really an extraordinary time.”
In Andrew Feldman’s view, the future of AI is being written not just with bigger chips, but through bold international partnerships and a fierce commitment to sovereign capability. With a fresh funding round at a $23 billion valuation, a clear runway to an IPO, and a massive deal with OpenAI, Cerebras is building more than just chips; it is building a new center of gravity in the AI hardware world.
Benzinga Disclaimer: This article is from an unpaid external contributor. It does not represent Benzinga’s reporting and has not been edited for content or accuracy.