OpenAI, a non-profit AI company that will lose anywhere from $4 billion to $5 billion this year, will at some point in the next six or so months convert into a for-profit AI company, at which point it will continue to lose money in exactly the same way. Shortly after this news broke, Chief Technology Officer Mira Murati resigned, followed by Chief Research Officer Bob McGrew and VP of Research, Post Training Barret Zoph, leaving OpenAI with exactly three of its eleven cofounders remaining.
This coincides suspiciously with OpenAI's increasingly-absurd fundraising efforts, where (as I predicted in late July) OpenAI has raised the largest venture-backed round of all time, $6.6 billion, at a valuation of $157 billion.
Yet despite the high likelihood of the round's success, there are quite a few things to be worried about. The Wall Street Journal reported last week that Apple is no longer in talks to join the round, and while one can only speculate about its reasoning, it's fair to assume that Apple (AAPL), on signing a non-disclosure agreement, was able to see exactly what OpenAI had (or had not) got behind the curtain, as well as its likely-grim financial picture, and decided to walk away. Nevertheless, both NVIDIA (NVDA) and Microsoft (MSFT) are investing, with Microsoft, according to the Wall Street Journal, pushing another $1 billion into the company — though it's unclear whether that's in real money or in "cloud credits" that allow OpenAI to continue using Microsoft's cloud.
Yet arguably the most worrying sign is that SoftBank's Vision Fund will be investing $500 million in OpenAI. While it might seem a little weird to be worried about a half-billion dollar check, SoftBank — best-known for sinking $16 billion or more into WeWork and getting swindled by its founder, and dumping a further €900m into Wirecard, which turned out to be an outright fraud, with one executive now a fugitive from justice in Russia — is some of the dumbest money in the market, and a sign that any company taking it is likely a little desperate.
While SoftBank has had a number of hits — NVIDIA, ARM and Alibaba, to name a few — it is famous for piling cash into terrible businesses, like Katerra (a construction company that died despite a $2 billion investment in 2021) and Zume Pizza (a robotic pizza company with a product that never worked that closed after raising more than $400 million, with $375 million coming from SoftBank).
No, really, the SoftBank Vision Fund is in a bad way.
Last year, SoftBank's Vision Fund posted a record loss of $32 billion, after which the firm promised a "stricter selection of investments." One might think that three years of straight losses would humble Masayoshi Son, yet he recently declared that SoftBank was "born to realize artificial superintelligence," adding that he was "super serious about it," while also suggesting that Jews had settled in Osaka 1,000 years ago, making the people there "craftier," a comment that McDonald's had to issue a public apology for.
In any case, OpenAI will likely prevail and raise this round from a cadre of investors that will have to invest a minimum of $250 million to put money behind a company that has never turned a profit, that has no path to profitability, and has yet to create a truly meaningful product outside of Sam Altman's marketing expertise. This round is a farce — a group delusion, borne of one man's uncanny ability to convince clueless idiots that he has some unique insight, despite the fact that all signs point to him knowing about as much as they do, allowing him to prop up an unsustainable, unprofitable and directionless blob of a company as a means of getting himself billions of dollars of equity — and no, I don't care what he says to the contrary.
Last week, the New York Times reported that OpenAI would lose $5 billion in 2024 (which The Information had estimated back in July), and that the company intended to increase the price of ChatGPT Plus to $22-a-month by the end of 2024, with plans to reach $44-a-month over the next five years, a pale horse I've warned you of in the past.
Interestingly (and worryingly), the article also confirms another hypothesis of mine — that "fund-raising material also signaled that OpenAI would need to continue raising money over the next year because its expenses grew in tandem with the number of people using its products" — in simpler terms, that OpenAI will likely raise $6.6 billion in funding, and then have to do so again in short order, likely in perpetuity.
The Times also reports that OpenAI is making estimates that I would describe as "fucking ridiculous." OpenAI's monthly revenue hit $300 million in August, and the company expects to make $3.7 billion in revenue this year (the company will, as mentioned, lose $5 billion anyway), yet the company says that it expects to make $11.6 billion in 2025 and $100 billion by 2029, a statement so egregious that I am surprised it's not some kind of financial crime to say it out loud.
For some context, Microsoft makes about $250 billion a year, Google about $300 billion a year, and Apple about $400 billion a year.
To be abundantly clear, as it stands, OpenAI currently spends $2.35 to make $1.
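That figure can be sanity-checked from the Times' own numbers. A minimal sketch, assuming total costs are simply revenue plus the reported loss (a simplification that ignores anything off the income statement):

```python
# Rough sanity check of the "$2.35 spent per $1 earned" figure,
# assuming total costs = revenue + reported loss (a simplification).
revenue_2024 = 3.7e9   # projected 2024 revenue, per the NYT
loss_2024 = 5.0e9      # projected 2024 loss, per the NYT

total_costs = revenue_2024 + loss_2024          # ~$8.7B spent
cost_per_dollar = total_costs / revenue_2024    # dollars spent per dollar earned

print(f"${cost_per_dollar:.2f} spent to make $1")  # ~$2.35
```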
OpenAI loses money every single time that somebody uses its product, and while it might make money selling premium subscriptions, I severely doubt it's turning a profit on these customers, and it's certainly losing money on any and all power users. As I've said before, I believe there's also a subprime AI crisis brewing, because OpenAI's API services — which let people integrate its various models into external products — are currently priced at a loss, and increasing prices will likely make this product unsustainable for many businesses currently relying on these discounted rates.
As I've said before, OpenAI is unprofitable, unsustainable and untenable in its current form, but I think it's important to explain exactly how untenable it is, and I'm going to start with a few statements:
- For OpenAI to hit $11.6 billion of revenue by the end of 2025, it will have to more than triple its revenue.
- At the current cost of revenue, it will cost OpenAI more than $27 billion to hit that revenue target. Even if it somehow halves its costs, OpenAI will still lose $2 billion.
- However, OpenAI's costs are likely to increase, because (as the New York Times notes) if this company grows by 300%, it's very likely that the free user base of ChatGPT increases along with it, burdening the company with more costs.
- Even a $2 price increase (the first expected price hike for ChatGPT Plus, as the company reportedly works towards charging $44 per month) and similar price hikes on the Teams and Enterprise plans won’t do much to stem the flow of red ink on its balance sheet.
- GPT-4 — and this isn't inclusive of GPT-4o — cost $100 million to train, and more complex future models will cost hundreds of millions or even a billion dollars to train. The Information also estimated back in July that OpenAI's training costs would balloon to $3 billion in 2024.
- OpenAI has not shipped anything truly important since the launch of GPT-3.5, and its recent o1 model has not been particularly impressive. It's also going to be much, much more expensive to run, as the "chain-of-thought" "reasoning" that it does requires a bunch of extra calculations (an indeterminate amount that OpenAI is deliberately hiding), and OpenAI can't even seem to come up with a meaningful use case.
- OpenAI's products are increasingly-commoditized, with Google, Meta, Amazon and even Microsoft building generative AI models to compete. Worse still, these models are all using effectively-identical training data (and they're running out!), which makes their outputs (and by extension their underlying technology) increasingly similar.
- OpenAI's cloud business — meaning other companies connecting their services to OpenAI's API — is remarkably small, to the point that it suggests there are weaknesses in the generative AI industry writ large. It's extremely worrying that the biggest player in the game only makes $1 billion (less than 30% of its revenue) from providing access to its models.
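The second bullet above is simple arithmetic, and it's worth making concrete. A sketch, assuming OpenAI's current ~$2.35 cost-per-dollar ratio carries over to 2025 (it may well get worse, per the third bullet):

```python
# Back-of-the-envelope projection for 2025, assuming OpenAI's current
# cost structure ($2.35 spent per $1 of revenue) carries over unchanged.
revenue_target_2025 = 11.6e9
cost_per_dollar = 8.7e9 / 3.7e9   # ~2.35, from 2024's ~$8.7B costs on $3.7B revenue

projected_costs = revenue_target_2025 * cost_per_dollar           # ~$27.3B
loss_if_costs_halved = projected_costs / 2 - revenue_target_2025  # ~$2.0B loss

print(f"Projected costs: ${projected_costs / 1e9:.1f}B")
print(f"Loss even if costs halve: ${loss_if_costs_halved / 1e9:.1f}B")
```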
And, fundamentally, I can find no compelling evidence that suggests that OpenAI will be able to sustain this growth. In fact, I can find no historical comparison, and believe that OpenAI's growth is already stumbling.
Let's take a look, shall we?
How Does OpenAI Make Money?
To do this right, we have to lay out exactly how OpenAI makes money.
According to the New York Times, OpenAI expects ChatGPT to make about $2.7 billion in revenue in 2024, with an additional $1 billion coming from "other businesses using its technology."
Let's break this down.
ChatGPT Plus, Teams, and Enterprise — 73% of revenue (approximately $2.7 billion).
- OpenAI sells access to ChatGPT Plus to consumers for $20 a month, offering faster response times, "priority access to new features," and 24/7 access to OpenAI's models, with "5x more messages for GPT-4o," plus access to image generation, data analysis and web browsing. Importantly, OpenAI can use anything you do as training data, unless you explicitly opt out.
- OpenAI sells access to a "Teams" version of ChatGPT Plus, a self-service product that allows you to share chatbots between team users, costing $25-a-user-a-month if paid annually (so $300 a year per-user), and $30-a-user-a-month if paid monthly. From this point on, your data is excluded from that used to train OpenAI's models by default.
- OpenAI sells "enterprise" subscriptions that include an expanded context window for longer prompts (meaning you can give more detailed instructions), admin controls, and "enhanced support and ongoing account management."
- It isn't clear how much this costs, but a Reddit thread from a year ago suggests it's $60-a-user-a-month, with a minimum of 150 seats on an annual contract.
- I don’t know for certain, but it’s likely OpenAI offers some kind of bulk discount for large customers that buy in volume, as is the case with pretty much every enterprise SaaS business. I’ll explain my reasoning later in this piece.
- Assuming this is the case, that’s bad for OpenAI, as generative AI isn’t like any other SaaS product. Economies of scale don’t really work here, as servicing each user has its own cost (namely, the cloud computing power used to answer queries). That cost-per-user doesn’t decrease as you add more customers. You need more servers. More GPUs.
- Cutting prices, therefore, only serves to slash whatever meager margins exist on those customers, or to turn those potentially-profitable customers into a loss center.
Licensing Access To Models And Services — 27% of revenue (approximately $1 billion).
- OpenAI makes the rest of its money by licensing access to its models and services via its API. One thing you notice, when looking at its pricing page, is the variety of models and APIs available, and the variation in pricing that exists.
- OpenAI offers a lot of options: its most powerful GPT-4o model; the less-powerful-yet-cheaper GPT-4o-mini model; the "reasoning" model o1 (and its "mini" counterpart); a "text embeddings" API that is used primarily for tasks where you want to identify anomalies or relationships in text, or classify stuff in text; an "assistants API" for building assistants into an application (which in turn connect to one of the other models, which includes things like interpreting code or searching for files); three different image generation models; three different audio models; and a bunch of older legacy APIs and models.
- In many cases, customers can get a 50% discount by using the Batch API. This delays completion by as much as 24 hours and requires all tasks to be submitted in one batch, rather than as-and-when. This might be useful for using GPT to dig through masses of data.
- For example, when using the Batch API, the cost of using GPT-4o drops from $5 per 1M input tokens to $2.50, and from $15 per 1M output tokens to $7.50.
- Batch pricing is not available for o1-preview.
- Additionally, this discount is not available when buying training tokens for fine-tuning models (although you still get the same discount for input and output tokens).
- Batch pricing is not available for DALL-E, the Assistants API, or the audio models.
- It’s also not available for GPT-3.5-turbo-instruct or the chatgpt-4o-latest model.
- The pricing of these products gets a little messy, much like it does with basically every cloud company.
- OpenAI also makes around $200 million a year selling access to its models through Microsoft, according to Bloomberg.
- In conclusion, this means that OpenAI makes roughly $800 million a year by directly selling access to its API, with a further $200m coming from an external channel.
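To illustrate how the Batch API discount plays out in practice, here's a sketch using the GPT-4o prices quoted above ($5 per 1M input tokens, $15 per 1M output tokens); the workload sizes are hypothetical, purely for illustration:

```python
# A sketch of the Batch API's 50% discount, using the GPT-4o prices
# quoted above. The job sizes are hypothetical, not real usage data.
INPUT_PRICE_PER_M = 5.00    # $ per 1M input tokens, standard pricing
OUTPUT_PRICE_PER_M = 15.00  # $ per 1M output tokens, standard pricing
BATCH_DISCOUNT = 0.5        # batch jobs cost half, at the price of a delay

def job_cost(input_tokens_m: float, output_tokens_m: float, batch: bool = False) -> float:
    """Dollar cost of a job, given token counts in millions."""
    cost = input_tokens_m * INPUT_PRICE_PER_M + output_tokens_m * OUTPUT_PRICE_PER_M
    return cost * BATCH_DISCOUNT if batch else cost

# Hypothetical job: 100M input tokens, 20M output tokens.
standard = job_cost(100, 20)              # $500 + $300 = $800
batched = job_cost(100, 20, batch=True)   # same job, delayed up to 24h: $400
```

The trade is straightforward: half the price in exchange for waiting up to a day, which is why it suits bulk data-crunching rather than interactive use.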
As a result of these numbers, I have major concerns about the viability of OpenAI's business, and the generative AI market at large. If OpenAI — the most prominent name in all of generative AI — is only making a billion dollars a year from this, what does that say about the larger growth trajectory of this company, or actual usage of generative AI products?
I'll get to that in a bit.
First, we've gotta talk about the dollars.
The Revenue Problem
So, as it stands, OpenAI makes the majority — more than 70% — of its revenue from selling premium access to ChatGPT.
A few weeks ago, The Information reported that ChatGPT Plus had "more than 10 million paying subscribers," and that it had 1 million more that were paying for "higher-priced plans for business teams." As I've laid out above, this means that OpenAI is making about $200 million a month from consumer subscribers, but "business teams" is an indeterminate split between teams ($25-a-user-a-month paid annually) and enterprise (at least $60-a-user-a-month, paid annually, with a minimum of 150 seats).
One important detail: 100,000 of the 1 million business customers are workers at management consultancy PwC, which has also become OpenAI's "first partner for selling enterprise offerings to other businesses." It isn't clear whether these are enterprise accounts or teams accounts, or whether PwC is paying full price (I'd wager it isn’t).
Here’s how this would play out in revenue terms across several assumed divisions of the customer base, with an assumption that every Teams customer is paying $27.50 (that plan costs either $25 or $30, depending on whether you pay annually or monthly, so for the sake of fairness, I went with the middle ground). From there, we can run some hypothetical monthly revenue numbers based on a million "higher-priced plans for business teams."
- 25% Enterprise, 75% Teams: $35,625,000
- 50% Enterprise, 50% Teams: $43,750,000
- 75% Enterprise, 25% Teams: $51,875,000
Sadly, I don't think things are that good, and I honestly don't think these would be particularly-impressive numbers to begin with.
We can actually make a more-precise estimate by working backwards from the New York Times' estimates. ChatGPT Plus has 10 million customers, making OpenAI around $2.4 billion a year (ten million users spending $20 each month equates to $200 million; multiply that by 12 and you get $2.4 billion). This means that business users make up about $300 million a year in revenue, or $25 million a month.
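Both the scenario numbers and the backwards estimate fall out of simple arithmetic. A sketch — the Enterprise/Teams splits are, as noted, pure assumptions:

```python
# Hypothetical monthly revenue from 1M "business" seats, under assumed
# splits between Enterprise (~$60/user/month) and Teams (averaged to $27.50).
BUSINESS_SEATS = 1_000_000
ENTERPRISE_PRICE = 60.00   # $/user/month, per the Reddit-sourced estimate
TEAMS_PRICE = 27.50        # midpoint of the $25 (annual) and $30 (monthly) tiers

def monthly_business_revenue(enterprise_share: float) -> float:
    enterprise = BUSINESS_SEATS * enterprise_share * ENTERPRISE_PRICE
    teams = BUSINESS_SEATS * (1 - enterprise_share) * TEAMS_PRICE
    return enterprise + teams

for share in (0.25, 0.50, 0.75):
    print(f"{share:.0%} Enterprise: ${monthly_business_revenue(share):,.0f}")

# Working backwards from the NYT's estimates instead:
chatgpt_revenue = 2.7e9               # total ChatGPT revenue, per the NYT
plus_revenue = 10_000_000 * 20 * 12   # 10M subscribers at $20/month = $2.4B
business_annual = chatgpt_revenue - plus_revenue   # ~$300M/year
print(f"Implied business revenue: ${business_annual / 12 / 1e6:.0f}M/month")  # ~$25M
```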
That is, to be frank, extremely bad. These are estimates, but even if they were doubled, these would not be particularly exciting numbers.
For all the excitement about OpenAI's revenue — putting aside the fact that it spends $2.35 to make $1 — the majority of the money it makes is from subscriptions to ChatGPT Plus for consumers, though, to be fair, plenty of professionals use the consumer version too.
While 10 million paying subscribers might seem like a lot, "ChatGPT" is effectively to generative AI what "Google" is to search. Ten million people paying for this is table stakes.
OpenAI has been covered by effectively every single media outlet, is mentioned in almost every single conversation about AI (even when it's not about generative AI!), and has the backing and marketing push of Microsoft and the entirety of Silicon Valley behind it. ChatGPT has over 200 million weekly users, and the New York Times reports that 350 million people used OpenAI's services each month as of June (though it's unclear if that includes those using the API). Collectively, this means that OpenAI — the most popular company in the industry — can only convert about 3% of its users.
This might be because it's not obvious why anyone should pay for a premium subscription. Paying for ChatGPT Plus doesn't dramatically change the product, nor does it offer a particularly-compelling new use case for anyone other than power users. As a company, OpenAI is flat-out terrible at product. While it may be able to attract hundreds of millions of people to dick around with ChatGPT (losing money with every prompt), it's hard to convert them because you have to, on some level, show the user what ChatGPT can do to get them to pay for it… and there isn't really much you can charge for, other than limiting how many times they can use it.
And, if we're honest, it still isn't obvious why anyone should use ChatGPT in the first place, other than the fact everybody is talking about it. You can ask it to generate something — a picture, a few paragraphs, perhaps a question — and at that point say "cool" and move on. I can absolutely see how there are people who regularly use ChatGPT's natural language prompts to answer questions they can't quite articulate (a word that's on the tip of their tongue, a question they're not sure how to phrase, or something they want to brainstorm), but beyond that, there really is no "sticky" part of this product beyond "a search engine that talks back to you."
That product is extremely commoditized. The free version of ChatGPT is effectively identical to the free version of Anthropic's Claude, Meta's AI assistant, Microsoft's Copilot, and even Twitter's "Grok." They all use similar training data, all give similar outputs, and are all free. Why would you pay for ChatGPT Plus when Meta or Microsoft will give you their own spin on the same flavor? Other than pure brand recognition, what is it that ChatGPT does that Copilot (powered by ChatGPT) doesn't? And does that matter to the average user?
I'd argue it doesn't. I'd also argue that those willing to pay for a "Plus" subscription are more likely to use the platform way, way more than free users, which in turn may (as one Redditor hypothesized regarding Anthropic's "Claude Pro" subscription) make those premium subscribers unprofitable. While there's a chance that OpenAI could have a chunk of users that aren't particularly active, one cannot run a business based on selling stuff you hope that people won't use.
A note on “free” products
I’ll touch on customer churn later, but one more note about ChatGPT Plus users: as with any other consumer-centric subscription product, these customers are far more likely to cut their spending when they no longer feel like they’re getting value from the product, or when their household budgets demand it. Netflix — the biggest name in streaming — lost a million customers in 2022, around the time of the cost-of-living crisis (and, from 2025, it plans to stop reporting subscriber numbers altogether).
ChatGPT Plus is likely, for many people, a “lifestyle product.” And the problem is that, when people lose their jobs or inflation hikes, these products are the first to get slashed from the household budget.
OpenAI also has a unique problem that makes it entirely different from most SaaS businesses — the cost of delivering the product. While a 3% conversion rate from free to paying customers might ordinarily sit on the low side of "good," traditional software is nowhere near as expensive to operate as anything built on generative AI.
There's also another wrinkle.
If the majority of OpenAI's revenue — over 70% — comes from people paying for ChatGPT Plus, then that heavily suggests the majority of its compute costs come from what is arguably its least-profitable product. The only alternative is that OpenAI's compute costs are so high that ChatGPT, despite generating the majority of the company's revenue, creates so much overhead that it sours the rest of the business.
You see, ChatGPT Plus is not a great business. It's remarkable that OpenAI found 10 million people to pay for it, but how do you grow that to 20 million, or 40 million?
These aren't idle questions, either. At present, OpenAI makes $225 million a month — $2.7 billion a year — by selling premium subscriptions to ChatGPT. To hit a revenue target of $11.6 billion in 2025, OpenAI would need to increase revenue from ChatGPT customers by 310%.
If we consider the current ratio of Plus revenue to Teams and Enterprise revenue — about 88.89% to 11.11% — OpenAI would need to find an additional 18.29 million paying users (assuming a price increase of $2 a month), while also retaining every single one of its current ChatGPT Plus users at the new price point, for a total of $7.4 billion, or $616 million or so a month. It would also have to make $933 million in revenue from its business and enterprise clients, which, again, would require OpenAI to more-than-triple its current users.
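For transparency, here's roughly how those figures fall out of the stated assumptions. This is my reconstruction — I scale 2024 revenues by the growth factor implied by the $11.6 billion target, so the results land near, rather than exactly on, the figures above:

```python
# Reconstruction of the growth math: scale 2024 revenues by the factor the
# $11.6B target implies, then convert the Plus share into subscribers at the
# rumored $22/month price. All figures are approximations.
revenue_2024 = 3.7e9
target_2025 = 11.6e9
growth_factor = target_2025 / revenue_2024       # ~3.14x

plus_revenue_2024 = 2.4e9        # 10M subscribers at $20/month
business_revenue_2024 = 0.3e9    # Teams/Enterprise, implied by the NYT numbers

plus_target = plus_revenue_2024 * growth_factor          # ~$7.5B
business_target = business_revenue_2024 * growth_factor  # ~$0.94B

subscribers_needed = plus_target / 12 / 22           # at $22/month: ~28.5M total
new_subscribers = subscribers_needed - 10_000_000    # ~18.5M on top of today's 10M

print(f"Growth factor: {growth_factor:.2f}x")
print(f"New Plus subscribers needed: {new_subscribers / 1e6:.1f}M")
```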
OpenAI's primary revenue source is one of the most easily-commoditized things in the world — a Large Language Model in a web browser — and its competitor is Mark Zuckerberg, a petty king with a huge warchest who can never, ever be fired, even with significant investor pressure. Even if that wasn't the case, the premium product that OpenAI sells is far from endearing, still looking for a killer app a year-and-a-half into its existence, with its biggest competitor being the free version of ChatGPT.
There are ways that OpenAI could potentially turn this around, but even a battalion of experienced salespeople will still need paying, and will have the immediate job of "increase revenue by 300%" for a product that most people have trouble explaining.
No, really. What is ChatGPT? Can you give me an answer that actually explains what the product does? What is the compelling use case that makes this a must-have?
I am hammering this point because this is the majority of OpenAI's revenue. OpenAI lives and dies on the revenue gained from ChatGPT, a product that hasn't meaningfully changed since it launched beyond adding new models that do, for most users, exactly the same thing. While some might find ChatGPT's voice mode interesting, "interesting" just isn't good enough today.
And to drill down further, the majority of OpenAI's revenue is from ChatGPT Plus, not its Enterprise or Teams product, meaning that hiring a sales team is far from practical. How do you sell this to consumers, or professionals? Even Microsoft, which has a vast marketing apparatus and deep pockets, struggled to sell Copilot — which is based on OpenAI’s GPT models — on its weird (and presumably expensive) Super Bowl ads, or on the countless commercials that dotted the 2024 Olympic Games.
To triple its users, ChatGPT must meaningfully change, and do so immediately, or demonstrate multiple meaningful, powerful use cases so impressive that 18 million new people agree to pay $22 a month. That is an incredible — and some might say insane — goal, and one that I do not think this company is capable of achieving.
Yet this is far from the most worrying part of the current OpenAI story.