The Numbers Are Out

Confidential financial documents from OpenAI and Anthropic have found their way into a Wall Street Journal report, and the figures inside are staggering. OpenAI, the company behind ChatGPT, projects it will burn through $85 billion in cash during 2028 alone before reaching profitability around 2030. Anthropic, its closest competitor, tells a similar story with smaller numbers: peak losses this year, a path to green by 2029.

Both companies shared these projections with investors during recent funding rounds. Neither has filed official IPO paperwork yet. But the sheer scale of the spending tells the real story of what it costs to compete at the frontier of artificial intelligence.

$121 Billion on Compute. One Year.

The single most striking figure in the entire report is OpenAI’s compute budget for 2028: $121 billion. That is not revenue. That is not total expenses. That is just the cost of the computing power needed to train and run AI models. For context, OpenAI expects to spend just over $25 billion on training in 2026. That number nearly quintuples in two years.
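That "nearly quintuples" claim is easy to sanity-check. A quick back-of-envelope, using the two figures reported above; the implied per-year growth factor is derived here, not stated in the report:

```python
# Back-of-envelope check on the spending ramp described above.
# The 2026 and 2028 figures come from the reported projections; the
# implied annual growth factor is derived, not reported.
spend_2026 = 25.0   # $B, projected training spend, 2026
spend_2028 = 121.0  # $B, projected compute budget, 2028

multiple = spend_2028 / spend_2026  # ~4.84x over two years
annual_factor = multiple ** 0.5     # ~2.2x per year, if growth is steady

print(f"{multiple:.2f}x over two years, ~{annual_factor:.1f}x per year")
```

Growing any line item 2.2x per year, every year, is the kind of curve that only looks sane while everything else is compounding too.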

The projection for 2029 ticks slightly higher before dropping back below $100 billion in 2030. The implicit assumption is that training methods get more efficient, or that the models simply do not need to keep growing at the same pace. That is a big bet.

To put $121 billion in perspective: Amazon spent roughly $77 billion on all capital expenditures in 2025, including its entire logistics network, data centers, and content production. OpenAI’s compute budget alone would exceed that.

Anthropic: Same Direction, Different Speed

Anthropic’s financial picture mirrors OpenAI’s trajectory, just on a smaller scale. The company expects 2026 to be its worst year for losses, then gradual improvement toward profitability in 2028 and consistent profits by 2029.

Anthropic’s compute spending surpasses $30 billion in 2029. Revenue projections approach $150 billion by 2029, driven heavily by enterprise sales through cloud partners like Amazon Web Services and Google Cloud. That channel strategy differs from OpenAI’s, which leans harder on consumer products.

Both companies face the same basic question: can revenue growth keep pace with the compounding cost of training ever-larger models? The financials say they think so. But the gap between projection and reality is where IPO investors will make or lose their money.

The Revenue Side of the Equation

The spending only makes sense if revenue scales accordingly. OpenAI projects its revenue will nearly double each year, reaching roughly $275 billion by 2030. Of that, about $150 billion would come from consumer products, including paid ChatGPT subscriptions and whatever advertising or premium services the company builds.
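Working backward from that target shows what "nearly double each year" actually requires. A minimal sketch, assuming a clean 2x per year from 2026 to 2030; the implied 2026 starting point is derived under that assumption and is not a figure from the report:

```python
# What doubling annually to ~$275B by 2030 implies, working backward.
# Assumption: a clean 2x each year over 2026 -> 2030 (four doublings).
# The implied 2026 figure is derived, not taken from the report.
target_2030 = 275.0  # $B, reported projection
years = 4            # 2026 -> 2030

implied_2026 = target_2030 / 2 ** years
path = [implied_2026 * 2 ** i for i in range(years + 1)]
print([round(v, 1) for v in path])  # ~[17.2, 34.4, 68.8, 137.5, 275.0]
```

Under that assumption, every single year on the path has to land. Miss one doubling and the 2030 number falls by half.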

That consumer number is ambitious. For comparison, Apple’s entire Services division, which includes the App Store, Apple Music, iCloud, and Apple TV+, generated about $96 billion in fiscal 2025. OpenAI expects its consumer AI products alone to outpace that within five years.

Anthropic’s revenue path runs more through enterprise. The company projects strong growth from corporate customers who embed Claude into workflows, customer service, code generation, and document analysis. Cloud partnerships mean Anthropic does not have to build the distribution itself. It piggybacks on infrastructure that already exists inside big companies.

Why This Matters for Everyone Else

These numbers do not just affect OpenAI and Anthropic investors. They ripple outward across the entire technology industry.

Chipmakers like Nvidia benefit directly from every dollar of compute spending. Rising GPU rental costs, confirmed by Silicon Data’s market indexes, show no signs of easing. The usual post-launch price drops are not happening. Demand is too strong.

Startups building on top of these models face a strategic calculation. If the foundational model companies are spending this much, pricing for API access will eventually need to reflect that cost. Cheap inference today might not last. Companies that built their entire business model on $2-per-million-token pricing could face a rude shock.
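The exposure is easy to see in unit economics. An illustrative sketch using the $2-per-million-token figure from the text; the request size and the hypothetical repriced rate are assumptions for illustration only:

```python
# Illustrative per-request cost for an app built on API pricing.
# The $2/M-token rate comes from the text; the tokens-per-request
# figure and the hypothetical 5x repricing are assumptions.
def cost_per_request(tokens: int, price_per_million: float) -> float:
    """Dollar cost of one request at a given per-million-token price."""
    return tokens / 1_000_000 * price_per_million

tokens_per_request = 5_000  # assumed: prompt + completion combined

today = cost_per_request(tokens_per_request, 2.0)      # at $2/M tokens
repriced = cost_per_request(tokens_per_request, 10.0)  # if prices went 5x

print(f"${today:.3f} -> ${repriced:.3f} per request")
```

A nickel per request instead of a penny sounds trivial until it is multiplied across millions of daily calls on a product priced for the penny.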

Enterprise buyers should pay attention too. The AI platforms they commit to in 2026 might have very different economics by 2028. Lock-in is real, and switching costs are high. Understanding the financial health of the model provider is now part of the procurement process.

The IPO Question

Both companies are moving toward public listings. OpenAI is reportedly targeting a 2026 IPO. Anthropic’s timeline is less clear but likely within the same window.

The challenge for both is straightforward: how do you sell a company to public market investors when the core business plan involves losing tens of billions of dollars per year for the rest of the decade? The answer is growth. If revenue actually does double annually, the story writes itself. But if growth stalls, or if compute costs run even higher than projected, the math gets ugly fast.

Public market investors have seen this movie before. Uber lost money for over a decade before turning profitable. Amazon ran at a loss for years. The difference is scale. OpenAI’s projected 2028 cash burn of $85 billion is larger than the GDP of many countries. There is no historical precedent for a company losing that much money and surviving, let alone thriving.

Then again, there was no precedent for a company reaching 100 million users in two months before ChatGPT either.

What to Watch

The real signal will come in late 2026 and early 2027. If OpenAI’s revenue growth keeps pace and enterprise adoption of Anthropic’s Claude continues to accelerate, the spending projections start to look reasonable. If either company misses a quarter, the narrative shifts fast.

GPU pricing remains the key variable. Any breakthrough in training efficiency, whether through better algorithms, custom silicon, or alternative architectures, could significantly alter the cost trajectory. Both companies are investing in this area, but so far, no one has cracked the code on making frontier training cheap.

The AI industry has operated on faith for three years: faith that models will keep improving, faith that customers will keep paying, faith that the economics will eventually work. The Wall Street Journal report is the first detailed look at exactly what that faith costs. The number is $121 billion a year in compute alone. Whether that faith is rewarded or punished will define the next decade of technology.
