
Cerebras Systems’ monster debut on Thursday not only ranked among the largest IPOs in technology – it was a clear signal of the inexorable demand for chips to power AI, as the tech giants look for alternatives to the expensive, sold-out graphics processors made by Nvidia.
Cerebras closed its first day of trading on Wall Street with a market value of more than $100 billion, putting it alongside the handful of companies, such as Facebook-parent Meta and Alibaba, that have closed their debuts above that mark. The stock closed 10% lower on Friday, its first full day of trading.
Here’s what you need to know about this hot Nvidia competitor.
Cerebras makes a different kind of chip than the classic Nvidia GPU, and it’s the size of a dinner plate.
“We’re building the largest chips in the semiconductor industry,” Cerebras CEO and founder Andrew Feldman told CNBC on Squawk Box on Thursday. “Bigger chips process more information in less time and deliver results much faster.”
Until now, Nvidia has been winning the AI chip race because its GPUs work like general-purpose workhorses, excelling at the calculations needed to train large models. But now we’ve entered the age of agentic AI, where inference is key. While training teaches an AI model to learn from patterns in large amounts of data, inference uses the trained model to make decisions based on new information.
Inference can run on lower-power chips designed for specific tasks, such as Cerebras’ WSE-3. It falls into a category of chips known as custom ASICs – application-specific integrated circuits – an increasingly crowded space, with in-house ASICs now built by the likes of Google, Amazon, Meta and Microsoft.
Cerebras said the WSE-3 is 57 times larger than the largest GPU, and has 50 times the number of transistors.
The most advanced AI chips are made on Taiwan Semiconductor Manufacturing’s 2-nanometer process node, which is currently only possible in Taiwan. The Cerebras chip is also made at TSMC, but on its less advanced 5-nanometer node.
Founded in Silicon Valley in 2016, Cerebras originally filed to go public in 2024, but withdrew that filing when it faced scrutiny over its heavy reliance on one customer, Microsoft-backed AI firm G42 in the United Arab Emirates.
With the company’s successful IPO on Thursday, Feldman and chief hardware architect Sean Lie, two of its founders, became billionaires based on their stakes in the company.
Cerebras one-week stock chart.
For years, Cerebras focused on selling chips to companies, but it now also runs its chips inside its own data centers as a cloud service, competing with cloud providers Google, Microsoft, Oracle and CoreWeave.
Cerebras and OpenAI announced a $20 billion cloud deal in January that runs through 2028, and Amazon Web Services announced in March that it is using Cerebras chips in its data centers.
“For our fast inference product, there’s so much demand that our biggest challenge is actually trying to supply it. We’re adding as much manufacturing and data center capacity as possible, and we’re still sold out into 2027,” Cerebras CFO Bob Komin told CNBC on Thursday.
While hyperscalers build their own ASICs in-house, Cerebras competes most closely with independent firms that sell specialized AI chips to others. Chief among them is Groq. In its biggest acquisition to date, Nvidia paid $20 billion for Groq’s technology in December, then announced custom Groq Language Processing Units at GTC in March.
SambaNova and D-Matrix are two other notable Cerebras competitors looking to capitalize on the unprecedented demand for AI chips.
SambaNova counts Hugging Face and Meta among the customers for its SN50 chips, and Intel participated in SambaNova’s $350 million funding round in February. Intel CEO Lip-Bu Tan has served as chairman of SambaNova since 2017.
Cerebras’ IPO also paves the way for other custom ASIC startups looking to go public, such as Rebellions.
The South Korean chip maker raised $400 million from the likes of Samsung, at a value of $2.34 billion, in March as it prepares for an IPO.
Watch: Breaking down AI chips, from Nvidia GPUs to ASICs by Google and Amazon