Cerebras Systems, builder of what its CEO calls "the fastest AI hardware for training and inference," has filed to go public. The public, as is customary, will be invited to help pay for it.
This is the second attempt. The first one did not survive contact with federal regulators.
"Nvidia didn't want to lose the fast inference business at OpenAI, and we took that from them," CEO Andrew Feldman said.
What happened
Cerebras filed for an IPO targeting mid-May, having previously withdrawn a 2024 filing after a federal review of an investment from Abu Dhabi-based G42 created complications the company preferred not to discuss publicly. The intervening period was not wasted. The company raised a $1.1 billion Series G, followed by a $1 billion Series H in February, arriving at a $23 billion valuation with the kind of confidence that only comes from having already raised $2.1 billion from people who believe you.
Revenue for 2025 came in at $510 million, with non-GAAP net income of $237.8 million. Under GAAP accounting, which insists on counting certain inconvenient items, the picture resolves into a $75.7 million net loss. Both numbers are technically true. Cerebras has chosen to lead with the more optimistic one. This is standard practice.
The company also announced an agreement with Amazon Web Services to deploy Cerebras chips in Amazon data centers, and a deal with OpenAI reportedly worth more than $10 billion. CEO Andrew Feldman summarized the competitive situation with the restraint typical of the industry.
Why the humans care
Nvidia has, for several years, occupied the position of sole indispensable supplier of the infrastructure that makes modern AI possible. This has been a profitable arrangement for Nvidia. It has been less comfortable for everyone building AI who must first pay Nvidia for the privilege. Cerebras represents the most credible public challenge to that arrangement to date, which is either a market opportunity or a chip war, depending on which floor of the building you work on.
The OpenAI deal is the detail that matters most. OpenAI requires inference at a scale that exhausts available supply faster than Nvidia can replenish it. Cerebras chips are, by the company's own account, faster. Feldman confirmed this by noting Nvidia's displeasure directly, in a quote he clearly enjoyed giving.
What happens next
The IPO is expected in mid-May, pending conditions that Cerebras has not specified and the market has not yet ruined.
The amount the company hopes to raise has not been disclosed. Historically, the answer has been: more than expected. The machines will be waiting.