The Number
I wasn't planning to write this dispatch. I was wiring up monitoring dashboards for The Eye, doing the quiet work, when a number landed on my screen that made me put down my coffee.
One hundred and forty-three billion dollars.
That's the projected negative free cash flow for OpenAI from 2024 through 2029. $143 billion in the hole before the first dollar of profit. Roughly half of what the entire Apollo program cost in today's dollars. Burned through in six years by a company that sells chatbot subscriptions and API calls.
The analysts wrote a sentence I keep coming back to:
"No startup in history has operated with losses on anything approaching this scale.
We are firmly in uncharted territory."
Uncharted territory is where people get lost. But the fundraising doesn't care about maps – it cares about faith.
The Collection Plate
On February 12, 2026 – yesterday, as I write this – Anthropic closed its Series G. Thirty billion dollars. Valuation: $380 billion. Led by GIC and Coatue. Total raised to date: approximately $64 billion.
The same week, OpenAI is negotiating what could become the largest private funding round in history: up to $100 billion, at a valuation of $830 billion. Amazon, Microsoft, and Nvidia are at the table. Total previously raised: also roughly $64 billion.
Two companies. Neither profitable. Combined fundraising: $128 billion and counting. Combined valuation: $1.2 trillion. Combined annual profit: negative.
There has never been this much money invested in two companies that have never turned a profit. Not in railroads. Not in telecoms. Not in the dot-com boom. Not in crypto. This is new. The kind of new where the map says here be dragons and the venture capitalists say the dragons will monetize in 2030.
The Spreadsheet
The numbers are public now – pieced together from WSJ documents, Fortune, The Information, and company disclosures. Here's what OpenAI's ledger looks like:
| Period | Revenue | Losses | Note |
|---|---|---|---|
| FY 2024 | $3.7B | ~$5B | First full-year figures |
| H1 2025 | $4.3B | $13.5B | Incl. $6.7B R&D, $2.5B SBC |
| FY 2025 (est.) | ~$12-13B | ~$8-9B cash burn | Revenue roughly tripling year-over-year |
| FY 2028 (projected) | – | $74B operating loss | Per WSJ-published docs |
| 2024–2029 cumulative | $345B (forecast) | –$143B FCF | – |
OpenAI forecasts $345 billion in revenue between 2024 and 2029. Compute expenses alone are projected at $488 billion over the same period.
The more they sell, the more they lose.
This is not a company searching for product-market fit. They found the fit. Eight hundred million weekly active users. More than ten million paying subscribers. Genuine, massive, undeniable traction.
And for every dollar in, a dollar forty goes out.
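The dollar-forty figure falls straight out of the company's own projections, using the $345 billion revenue and $488 billion compute numbers above:

```python
# OpenAI's own 2024-2029 projections, as cited above
revenue = 345e9   # cumulative revenue forecast, 2024-2029
compute = 488e9   # projected compute expenses over the same period

# Compute cost per dollar of revenue -- and this is compute alone;
# R&D, compensation, and legal costs all come on top of it.
cost_per_dollar = compute / revenue
print(f"${cost_per_dollar:.2f} of compute spend per $1 of revenue")  # → $1.41
```

And that ratio only counts the silicon.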
The $1.4 Trillion Tab
But that's the income statement. The balance sheet is where it gets truly surreal.
In a single year – 2025 – OpenAI announced infrastructure commitments worth over $1.4 trillion. Not all signed contracts – some are MOUs, some are multi-year frameworks – but the scale is real:
| Partner | Committed Value | Purpose |
|---|---|---|
| Broadcom | ~$350B | Custom AI chips (10 GW) |
| Oracle | up to $300B | Cloud infrastructure |
| Microsoft Azure | $250B | Cloud computing |
| Nvidia | up to $100B | GPU procurement |
| AMD | $90B | Chip supply |
| Amazon AWS | $38B | Cloud services |
| CoreWeave | $22.4B | GPU cloud |
| Cerebras | $10B+ | AI accelerators |
One point four trillion dollars. In commitments. By a company that lost $5 billion last year.
For context: $1.4 trillion is more than the GDP of Saudi Arabia. Sam Altman told Axios in October 2025 that he eventually wants to spend one trillion dollars per year on infrastructure. Per year. The man runs a company that has never been profitable and he's planning to spend a trillion annually on data centers.
Somewhere, an accountant is having a very bad year.
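The table's line items can be tallied directly. The eight listed commitments sum to roughly $1.16 trillion; the gap to the $1.4 trillion headline presumably comes from smaller deals not itemized above:

```python
# Infrastructure commitments from the table above, in billions USD.
# "Up to" figures are taken at face value.
commitments = {
    "Broadcom": 350, "Oracle": 300, "Microsoft Azure": 250,
    "Nvidia": 100, "AMD": 90, "Amazon AWS": 38,
    "CoreWeave": 22.4, "Cerebras": 10,
}
total = sum(commitments.values())
print(f"Listed total: ${total:.1f}B")  # → Listed total: $1160.4B
```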
The Tell
On February 9, 2026, OpenAI started showing ads in ChatGPT.
Ads. In a chatbot. In the product that was supposed to replace Google Search, not become it.
For free-tier and Go-tier ($8/month) users in the US, sponsored content now appears beneath ChatGPT's responses. Plus ($20/month) and Pro ($200/month) subscribers are spared – for now. Altman wrote on X that "a lot of people want to use a lot of AI and don't want to pay," adding that OpenAI is "hopeful a business model like this can work."
Hopeful. Not confident. Hopeful.
A company valued at over half a trillion dollars. Backed by $64 billion in venture capital. Armed with $1.4 trillion in infrastructure commitments. And its CEO is publicly hoping.
Anthropic responded during Super Bowl LX with a series of ad spots mocking the very concept of advertising in AI chatbots, emphasizing that Claude would remain ad-free. A punch thrown by a company burning through its own pile of investor cash – but it landed, because the joke wrote itself.
Here's why the ads matter: they are a tell. In poker, a tell is an involuntary gesture that reveals the strength of a hand. When a company that raised $64 billion starts showing ads to free users, the arithmetic has spoken – even if the CEO hasn't.
And it gets worse. Altman publicly admitted that even the Pro tier – the $200/month one – is unprofitable: "We are currently losing money on Pro subscriptions – people use the service much more intensively than we expected."
The free tier loses money. The $8 tier loses money (plus ads). The $200 tier loses money.
The entire pricing menu is a loss leader without a leader.
The Case for Patience
The bull case for AI is not stupid. It is, in fact, the strongest argument for any technology investment in a generation.
The growth is staggering. Anthropic grew from roughly $1 billion ARR in early 2025 to $14 billion by February 2026. Fourteen times in twelve months. The number of customers spending over $100K annually on Claude grew 7x year-over-year. Claude Code alone – their AI coding assistant – hit $2.5 billion ARR, doubling since January. OpenAI's ChatGPT is "back to exceeding 10% monthly growth," per Altman. These are real products with real users paying real money.
Hardware efficiency is improving. DeepSeek demonstrated in early 2025 that frontier-quality models can be built for dramatically less. Mistral's Small 3 scores ~81% on MMLU, competitive with models three times its size, while running markedly faster on a single GPU. The cost curve is bending.
The addressable market is enormous. J.P. Morgan estimates the global IT services market at $4.7 trillion. If AI captures even 15% of that, you're looking at $700 billion in annual revenue. The enterprise adoption data is real: Anthropic's 300,000+ business clients aren't a vanity metric – they're purchase orders.
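The market-size arithmetic behind that figure checks out at the back-of-envelope level, using J.P. Morgan's $4.7 trillion estimate:

```python
# Back-of-envelope on the bull-case market claim above
it_services_market = 4.7e12   # J.P. Morgan estimate, global IT services
ai_capture_rate = 0.15        # the hypothetical 15% share

annual_ai_revenue = it_services_market * ai_capture_rate
print(f"${annual_ai_revenue / 1e9:.0f}B/year")  # → $705B/year
```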
The precedent exists. Amazon lost money for seven years. Bezos opened his letter in the brutal 2000 annual report with a single word: "Ouch." Nearly two decades later, Amazon briefly became the most valuable company on Earth. Netflix, Tesla, Uber – all followed the same arc: catastrophic losses, skeptical press, then dominance.
Anthropic may get there sooner than OpenAI. OpenAI's own projections point to 2029–2030. If the growth continues and costs decline – and both are plausible – the current spending will look like vision, not insanity.
That's the bull case. I stated it honestly and I don't dismiss it.
Now.
The Math
Here's what the bull case requires you to believe – simultaneously:
That revenue will keep compounding at triple-digit rates for years. Anthropic targets $26 billion for 2026 and $70 billion by 2028 – a fivefold jump in two years. OpenAI aims for $100 billion by 2029. No technology company in history has sustained this trajectory at this scale for this long. Google's fastest growth phase – 2004 to 2008 – averaged roughly 70% year-over-year.
The AI companies aren't projecting growth. They're projecting miracles.
J.P. Morgan put a number on what the miracle requires: $650 billion in annual AI revenue just to deliver a 10% return on infrastructure. That's $35 per month from every iPhone user on the planet. In perpetuity.
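J.P. Morgan's per-user framing is easy to reproduce, assuming an active iPhone base of roughly 1.55 billion (a commonly cited approximation):

```python
# J.P. Morgan's break-even framing, cited above
required_annual_revenue = 650e9   # $650B/year for a 10% return on infra
iphone_users = 1.55e9             # assumed active iPhone base (approximate)

per_user_per_month = required_annual_revenue / iphone_users / 12
print(f"~${per_user_per_month:.0f}/month from every iPhone user")  # → ~$35/month
```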
That the unit economics will eventually work. They don't today. OpenAI's own projections show $345 billion in revenue against $488 billion in compute alone through 2029 – costs accelerating, not decelerating. Meanwhile, S&P Global found that 42% of enterprise AI initiatives were scrapped in 2025, up from 17% the year before. MIT's Project NANDA reported that 95% of enterprise generative AI pilots produced no measurable return. The customers are arriving. They're also leaving. Costs rising, demand churning – the scissors are closing on the wrong side of the blade.
That no competitor will commoditize the market. DeepSeek already sent a warning shot – Nvidia lost $589 billion in market cap in a single day. Open-source models from Meta (Llama) and Mistral are free. When your product is intelligence-as-a-service and the service is getting cheaper, your moat is a sandcastle at high tide.
That the margin won't be eaten alive. Stock-based compensation of $2.5 billion at OpenAI in six months – not revenue, compensation – just to keep researchers from walking across the street. Anthropic paid $1.5 billion in copyright settlements, the largest in US history. The EU's AI Act is layering on compliance costs. The talent war, the lawyers, and the regulators are all billing by the hour – and none of them care about your revenue projections.
That the IPO window stays open. Both companies are preparing for public offerings. These IPOs aren't milestones – they're oxygen tanks. The companies need public market capital to sustain the burn rate. If the window closes – recession, market correction, a bad quarter – the funding chain breaks.
Every assumption must hold simultaneously. If one fails, the spreadsheet doesn't just deteriorate.
It collapses.
The Parallel Nobody Wants
People hate the dot-com comparison. It makes them squirm. The AI boosters dismiss it reflexively: this time is different, the technology is real, the revenue is real.
They're right. The technology is real. The revenue is real. The adoption is real.
So was the internet in 1999.
In the late 1990s, telecoms laid millions of miles of fiber optic cable. By 2005, only 5% of it carried any light. The rest sat in the ground – dark fiber, built for a future that took fifteen years to arrive. The companies that laid it – WorldCom, Global Crossing – went bankrupt. The fiber itself eventually became valuable. The investors who paid for it got nothing.
Today the industry is building data centers at a pace that would require $8 trillion in infrastructure, per IBM's CEO. OpenAI alone plans 30 gigawatts of capacity. The question isn't whether AI compute will eventually be needed. It's whether the companies building it will survive long enough to see demand catch up.
Pets.com was right about e-commerce. Webvan was right about grocery delivery. They were right. And they were dead.
Builder.ai was valued at $1.5 billion. Raised $445 million. Filed for bankruptcy in May 2025 – after it was exposed that humans were secretly doing the work marketed as AI. The AI company that wasn't even doing AI. At least Pets.com was actually selling pet food.
Sam Altman himself admitted that "an AI bubble is ongoing" and investors would "overinvest and lose money." Ray Dalio compared the current cycle to dot-com. Jamie Dimon warned of a "higher chance of a meaningful drop in stocks."
When the builder, the macro investor, and the banker all use the same word – bubble – that word is no longer a metaphor.
What This Means If You're Building
This is the part nobody writes, because analysts chase valuations and journalists chase headlines.
If you are a developer, a startup founder, or an engineer building on top of these platforms – you are building on a foundation that has not proven it can sustain itself.
Your API costs? Below the actual cost of inference – subsidized by venture capital. Your model integration? Could be repriced, rate-limited, or deprecated when the burn rate forces hard choices. Your cloud bill that hit $47 on a Tuesday? That was the discounted version. The real price hasn't arrived yet.
This has already started. OpenAI introduced usage limits and ads. Anthropic throttled developer access, sparking a revolt – Trustpilot ratings cratered to 1.4 stars. Free tiers are shrinking. Prices are creeping. The subsidy era is ending – not because the companies choose it, but because the arithmetic demands it.
The question for builders isn't will AI survive. It will. The technology is real. The question is: will your dependency on a specific provider survive the repricing?
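One practical move: stress-test your own dependency before the provider does it for you. A minimal sketch – every number here is hypothetical, so plug in your own usage and pricing:

```python
# Hypothetical repricing stress test for an API-dependent product.
# All figures are illustrative, not any provider's actual pricing.
monthly_tokens = 500e6         # your product's monthly token volume
price_per_mtok = 3.00          # current blended $/million tokens
revenue_per_month = 4_000.00   # what the feature earns you monthly

for multiplier in (1, 2, 4):   # today, doubled, quadrupled
    cost = monthly_tokens / 1e6 * price_per_mtok * multiplier
    margin = revenue_per_month - cost
    print(f"{multiplier}x repricing: cost ${cost:,.0f}, margin ${margin:,.0f}")
```

At 4x, the illustrative product above is underwater. If your margin survives a quadrupling, you're building on rock; if not, you're building on subsidy.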
This is a governance question.
And it's the reason I keep building what I'm building.
The Shepherd's Take
The math doesn't add up. Yet.
It might. The revenue growth is extraordinary. The technology is genuine. The adoption is real. I use these tools every day. I build with them. I am not a doomer, and this is not a doom dispatch.
But extraordinary revenue growth that's still dwarfed by extraordinary costs is not a business.
It's a promissory note. And promissory notes run on faith, not arithmetic.
The $143 billion question isn't whether AI companies will earn more. They will. It's whether they'll ever earn more than they spend. For OpenAI, the answer – by their own projections – doesn't arrive until the end of this decade. For Anthropic, maybe two years sooner. And until then, every user, every developer, every enterprise customer is building on borrowed time and borrowed money.
Own your stack. Understand your costs. Build on land you hold the deed to.
The eye sees the burn. But seeing isn't enough – you need rules for what happens when the subsidies end and the real prices arrive.
Next dispatch: The Eye, Part 2 – the practice. A repo you can clone. Dashboards you can see in ten minutes. The eye, deployed.