# OpenAI Just Shipped GPT-5.5. It's Also Quietly Missing Its Own Revenue Targets.

Two things can be true at once. OpenAI can be shipping the most capable AI models the world has seen, and it can also be falling short of the revenue and user numbers its own leadership set as targets. According to a Wall Street Journal report published today, both are true.
OpenAI missed its internal goal of reaching one billion weekly active ChatGPT users by the end of 2025. It missed multiple monthly revenue targets earlier this year. And the company's CFO Sarah Friar is reportedly at odds with CEO Sam Altman over whether the pace of infrastructure spending is sustainable given where revenue actually is.
The market didn't take it well. Oracle, which has a $300 billion five-year partnership to supply compute to OpenAI, dropped more than 6%. Nvidia, Broadcom, and AMD slid between 3% and 5%. The companies whose valuations are most tightly coupled to OpenAI's growth story took an immediate hit.
## What the Numbers Actually Say
The details matter here. OpenAI didn't fail to grow. It grew substantially. It just didn't grow as fast as it told itself it would.
Missing a target of one billion weekly active users by the end of 2025 is notable for two reasons. First, because that's an extraordinarily ambitious target — one that very few products in history have hit at all, let alone on a set timeline. Second, because it reveals something about the assumptions baked into OpenAI's financial model. The company has committed to infrastructure spending that was predicated on user and revenue growth hitting specific numbers. If those numbers slip, the spending doesn't automatically adjust.
That's the tension Friar is reportedly raising internally. OpenAI has entered into large long-term contracts — with Oracle, with data center operators, with chip suppliers — based on projections that haven't fully materialized. The company is spending at a rate that requires the revenue growth to come. If growth arrives slower than projected, the math gets uncomfortable.
Altman's position, as reported, is to keep pushing. The model is that scale drives adoption, adoption drives revenue, and pulling back on infrastructure now would cede ground to competitors at exactly the wrong moment.
Both positions make sense in isolation. The disagreement is about how much risk is acceptable, and who carries it if the growth doesn't arrive on schedule.
## The Competitor Problem
The revenue miss isn't just a target-setting problem. There's a structural reason it's happening.
OpenAI has lost ground to Anthropic in coding and enterprise markets. That's a specific, measurable shift. Developers building AI-powered applications have been moving toward Claude — partly for capability reasons, partly because [Anthropic's custom silicon partnerships with Broadcom and Google](https://converzoy.com/insights/anthropic-broadcom-google-ai-chips-partnership) have given it infrastructure pricing advantages that matter at scale. Enterprise buyers are also diversifying. Locking into one AI provider has started to feel like the wrong strategy, and OpenAI is no longer the obvious default.
[Amazon's $33 billion Anthropic bet](https://converzoy.com/insights/amazon-anthropic-33-billion-deal) partly reflects this. Amazon didn't commit that kind of capital to Anthropic as a hedge. It did it because it sees Anthropic as a credible alternative at the enterprise tier — exactly the market OpenAI needs to own to justify its spending.
The irony of the timing is hard to ignore. OpenAI shipped GPT-5.5 last week at double the API price of its predecessor, framing it as a "new class of intelligence." That framing assumes enterprise buyers will pay the premium. If those buyers are already moving toward Anthropic or running their own models, the premium is harder to sustain.
## What This Means for the AI Spending Narrative
The broader AI infrastructure investment story — the one that's driven Oracle's valuation, Nvidia's stock price, and billions in data center commitments — is predicated on AI revenue growing fast enough to justify the spend. Today's report is a small crack in that narrative.
One missed target at one company doesn't break the thesis. But OpenAI is not just any company. It's the company the entire narrative is built around. If OpenAI's growth is slower than projected, the assumptions embedded in the valuations of every company downstream of it deserve a second look.
We covered the [data center delay and power shortage problem](https://converzoy.com/insights/ai-data-center-delays-power-shortage-2026) earlier this year — the supply-side constraint on AI infrastructure. The demand-side constraint is newer, and less discussed. Today is the first clear signal that it exists.
OpenAI will almost certainly hit its targets eventually. The question is whether eventually is fast enough for the commitments it's already made.