stack_framer 15 hours ago

I can generate $20 billion in ARR this year too! I just need you to give me $100 billion and allow me to sell each of your dollars for 0.2 dollars.

  • bgirard 14 hours ago

    It's a fun trope to repeat, but that's not what OpenAI is doing. I get a ton of value from ChatGPT and Codex from my subscription. As long as the inference is not done at a loss, this analogy doesn't hold. They're not paying me to use it. They are generating output that is very valuable to me, much more than my subscription costs.

    I've been able to set up cross-app automation for my partner's business, remodel my house, plan a trip to Japan and get help with the cultural barrier, vibe-code apps, get technical support, and so much more.

    • bloppe 14 hours ago

      To be fair, I would get a ton of value out of someone selling dollars for 20 cents apiece.

      But ya, OAI is clearly making a ton of revenue. That doesn't mean it's a good business, though. Given a 20-year horizon, shareholders will be very upset unless the firm can deliver about a trillion in profit, not revenue, to justify the $100B (so far) in investment, and even that would barely beat the long-term S&P 500 average return.
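
      A rough back-of-envelope of that compounding math; the ~10% nominal figure for the long-run S&P 500 average return is my own assumption, and this ignores cash-flow timing and dilution:

      ```python
      # Back-of-envelope: what $100B has to grow into over 20 years to match an
      # assumed ~10% nominal long-run S&P 500 return, plus the annualized return
      # implied by returning ~$1T on that $100B.
      investment = 100e9            # ~$100B invested so far (figure from the comment)
      years = 20
      sp500_nominal = 0.10          # assumed long-run nominal S&P 500 average return

      hurdle = investment * (1 + sp500_nominal) ** years
      implied = (1e12 / investment) ** (1 / years) - 1

      print(f"S&P-matching hurdle after {years} years: ${hurdle / 1e9:,.0f}B")  # ~$673B
      print(f"$1T on $100B over {years} years implies {implied:.1%}/yr")        # ~12.2%/yr
      ```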

      But Altman himself has said he'll need much more investment in the coming years. And even if OAI became profitable by jacking up prices and flooding ChatGPT with ads, the underlying technology is so commoditized that they'd never be able to achieve high margins, assuming they can turn a profit at all.

      • usef- 9 hours ago

        People did say the same thing about YouTube, which was unprofitable and extremely expensive to run in its early years. I remember thinking everyone would leave once ads were added.

        At YouTube's ad income per user (~$13/year), the current (but growing) ~800 million ChatGPT users would add ~$10 billion. At Facebook's rate (~$40-50/year), $32-40 billion. Potentially, an assistant would be more integrated into your life than either of those two.
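
        The arithmetic behind those figures, treating the per-user ad revenue (ARPU) numbers as the rough estimates they are in the comment:

        ```python
        # Rough ad-revenue projection: users x annual ad revenue per user (ARPU).
        # The ARPU figures are the comment's rough estimates, not audited numbers.
        users = 800e6  # ~800M ChatGPT users

        for label, arpu in [("YouTube-like", 13), ("Facebook-like, low", 40), ("Facebook-like, high", 50)]:
            print(f"{label}: ${users * arpu / 1e9:.0f}B/year")
        # YouTube-like: $10B/year; Facebook-like: $32B-$40B/year
        ```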

        The "audience retention" is the key question, not the profitability if they maintain their current audience. I've been surprised how many non-technical people I know don't want to try other models. "ChatGPT knows me".

      • littlestymaar 13 hours ago

        I'd be a little bit more nuanced:

        I think there's something off with their plans right now: it's pretty clear at this point that they can't own the technological frontier. Google is just too close already, and from a purely technological PoV it is much better positioned to have the best tech in the medium term. (There's no moat, and Google has way more data and compute available, plus tons of cash to burn without depending on external funding.)

        But ChatGPT is an insane brand, and for most (free) customers I don't think model capabilities (aka “intelligence”) are that important. So if they stopped training frontier models right now and focused on driving costs down by optimizing their inference compute budget while serving ads, they could make a lot of money from their user base.

        But that would probably mean losing most of their paying customers over the long run (companies won't keep buying mediocre tokens at a premium for long) and, more importantly, it would require abandoning the AGI bullshit narrative, which I'm not sure Altman is willing to do. (And even if he were, how to do that without collapsing from lack of liquidity as investors feel betrayed is an open question.)

    • felixfurtak 14 hours ago

      All of which you will be able to do with your bundled assistant in the not-too-distant future.

      OpenAI is a basket case:

      - Too expensive and inconvenient to compete with commoditized, bundled assistants (from Google/Microsoft/Apple)

      - Too closed to compete with cheap, customizable open-source models

      - Too dependent on partners

      - Too late to establish its own platform lock-in

      It echoes what happened to:

      - Netscape (squeezed by Microsoft bundling + open protocols)

      - BlackBerry (squeezed by Apple ecosystem + open Android OS)

      - Dropbox (squeezed by iCloud, Google Drive, OneDrive + open tools like rclone)

      When you live between giants and open-source, your margin collapses from both sides.

      • deathhand 12 hours ago

        So why does Salesforce still prosper? They are just a fancy database.

    • tartoran 13 hours ago

      There's no doubt you're getting a lot of value from OpenAI; I am too. And yes, the subscription delivers a lot more value than you pay for. That's because they're burning investors' money, and it's not sustainable. Once the money runs out they'll have to jack up prices, and that's the moment of truth: we'll see what users are willing to pay for what. Google or another company may be able to provide all of that much more cheaply.

      • HDThoreaun 8 hours ago

        Inference is profitable for OpenAI as far as I can tell. They don't need to jack up prices much; what they really need is users who are paying or consuming ads. They're burning money on free-tier users and data center expansion so they can serve more users.

        • NBJack 2 hours ago

          This assumes your model is static and never needs to be improved or updated.

          Inference is cheap because the final model, despite its size, is far less resource-intensive to run than it is to produce.

          ChatGPT in its latest form isn't bad by any means, but it is falling behind. Catching up requires significant overhead, both to train new models and to iterate on model architecture, and that is often a variable cost as well.

    • rglullis 14 hours ago

      > They're not paying me to use it.

      Of course they are.

      > As long as the inference is not done at a loss.

      If making money on inference alone were possible, there would be a dozen smaller providers taking the open-weights models and offering them as a service. But it seems every provider is anchored at $20/month, so you can bet none of them can go any lower.

      • threeducks 31 minutes ago

        You need a certain level of batch parallelism to make inference efficient, but you also need enough capacity to handle request floods. Being a small provider is not easy.

      • FeepingCreature 12 hours ago

        > If making money on inference alone was possible, there would be a dozen different smaller providers who'd be taking the open weights models and offering that as service.

        There are! Look through the provider list for some open model on https://openrouter.ai . For instance, DeepSeek 3.1 has a dozen providers. It would not make any sense to offer those below cost because you have neither moat nor branding.

      • dragonwriter 14 hours ago

        > If making money on inference alone was possible

        Maybe, but arguably a major reason you can't make money on inference right now is that the useful life of models is too short, so you can't amortize the development costs over much time. There is so much investment in the field that everyone is developing new models (shortening useful life in a competitive market) while simultaneously driving up the cost of the inputs needed to develop them (increasing the costs that have to be amortized over that short useful life). Perversely, the AI bubble popping and resolving those issues may make profitability much easier for the survivors that have strong revenue streams.
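
        A toy illustration of that amortization point; all the numbers here are hypothetical, none come from the comment:

        ```python
        # Toy model: the shorter a model's useful life, the more gross margin per month
        # inference has to throw off just to recover the (hypothetical) development cost.
        development_cost = 2e9  # hypothetical: $2B to develop a frontier model

        for useful_life_months in (36, 18, 9):
            required_margin = development_cost / useful_life_months
            print(f"{useful_life_months:>2} months of useful life -> "
                  f"${required_margin / 1e6:,.0f}M/month of inference margin just to break even")
        # Halving the useful life doubles the monthly margin needed before any profit shows up.
        ```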

      • rprend 10 hours ago

        They do make money on inference.

      • HDThoreaun 8 hours ago

        The open models suck. AWS hosts them for less than the closed models cost, but no one uses them, because they suck.

    • cush 9 hours ago

      > I get a ton of value from ChatGPT and Codex from my subscription

      I think that’s what they’re saying. OpenAI is selling you a $1 product for $0.2

      Tokens are too cheap right now and nobody is working on a path to dial up the cost

      • esafak 7 hours ago

        Predictions are supposedly profitable but not enough to amortize everything else. I don't see how they would justify their investments even if predictions cost them nothing.

    • munk-a 14 hours ago

      As a developer, ChatGPT doesn't hold a candle to Claude for coding-related tasks and underperforms on arbitrary-format document parsing[1]. It still has value and can handle a lot of tasks that would amaze someone in 2020, but it is simply falling behind while spending much more to do so.

      1. It actually underperforms Claude, Gemini, and even some of the Grok models on accuracy for our use case of parsing PDFs and other rather arbitrarily formatted files.

    • mirthflat83 14 hours ago

      Well, don't you think you're getting a ton of value because they're selling each of their dollars for 0.2 dollars?

    • jfb 13 hours ago

      That the product is useful does not mean the supplier of the product has a good business; and of course, vice versa. OpenAI has a terrible business at the moment, and the question is, do they have a plausible path to a good one?

    • steveBK123 14 hours ago

      If the subscription cost 5x as much would you still pay and feel you are getting such a great value?

      • atonse 9 hours ago

        I pay $100/month for Claude Max, and as I've said before, I would go up to $500 a month without hesitating for a second. I'd probably start to hesitate at $1,000, only cuz I know I wouldn't be able to use it enough to maximize that value. But I might still suck it up and pay for it (I don't use it enough yet to need the $200/month tier, but if I started hitting limits faster, I would upgrade), or at that point start looking for alternatives.

        It's worth that much to me in time saved. But I'm a business owner, so the calculus might be quite different from an individual's, who pays out of their main income (I can find ways to recoup those costs).

        I outlined examples of how I used CC/AI a couple months ago [1]. Since then I've used it even more, to help reduce our cloud bills.

        1: https://news.ycombinator.com/item?id=45382337

        • viking123 2 hours ago

          I mean Claude is good for business use-cases, other than that it's completely censored cuck garbage and the CEO is worse than the pope. With Grok you can actually roleplay without it wagging its finger at you. OH MY GOSH YOU SAID BOOB!

          Normies literally see no difference between GPT and Claude, just that Claude is much more expensive and its CEO is even more of a dummy than Altman.

      • dosinga 13 hours ago

        If there are no free alternatives, yes. 100 USD a month for ChatGPT seems like great value.

    • PantaloonFlames 10 hours ago

      You are mostly missing the point. You're saying you get value out of what OpenAI is offering you. That's not at issue here.

      The question is, does OpenAI get value out of the exchange?

      You touched on it ever so briefly: “as long as inference is not done at a loss”. That is it, isn’t it? Or, more generally, as long as OpenAI is making money. But they are not.

      There’s the rub.

      It’s not only about whether you think giving them your money is a good exchange. It needs to be a good exchange for both sides, for the business to be viable.

    • ReptileMan 14 hours ago

      > As long as the inference is not done at a loss this analogy doesn't hold.

      I think there was an article here claiming that even inference is done at a loss, on a per-subscriber basis. I think it was about their $200 subscription.

      In a way we will soon be in a "deal with it" situation where they just impose metered pricing instead of subscriptions.

    • csomar 3 hours ago

      That's not the parent's point, though. Their point is that if the models are now largely available and there are better competitors, then what's the point of ChatGPT? Maybe you decide to stick with ChatGPT for whatever reason, but people will move to cheaper and better alternatives.

  • umanwizard 13 hours ago

    This analogy only really works for companies whose gross margin is negative, which as far as I know isn’t the case for OpenAI (though I could be wrong).

    It’s an especially good analogy if there is no plausible path to positive gross margin (e.g. the old MoviePass) which I think is even less likely to be true for OpenAI.

    • techblueberry 13 hours ago

      Why is it that I feel like your confidence in OpenAI's path to profitability exceeds Sam Altman's?

      • umanwizard 12 hours ago

        I'm not confident at all. I didn't say "there is definitely a path". I said the existence of such a path is plausible. I'm sure Sam Altman believes that too, or he'd have changed jobs ages ago.

  • eli_gottlieb 13 hours ago

    We should perhaps say "profit" when we are talking about revenue minus cost, and "revenue" when we only mean the first term in that subtraction.

  • postflopclarity 14 hours ago

    very clever! I hadn't seen anybody make this point before in any of these threads /s

    obviously the nature of OpenAI's revenue is very different from selling $1 for $0.2, because their customers are buying an actual service, not something with resale value or something obviously fungible for dollars

    • runako 13 hours ago

      FWIW, "selling $1 for $0.20" is widely applied to any business that is selling goods below cost.

      For example: free shipping at Amazon does not have resale value and is not obviously fungible, but everyone understands Amazon is eating a cost that would otherwise be borne by its customers. The suggestion is that OpenAI is doing something similar, though it is harder to tease out because their books are opaque.

    • array_key_first 10 hours ago

      They're not selling a service, they're selling access to a service. You can access a more or less equivalent service from multiple companies.

      The value of an LLM isn't an LLM. That's entirely 100% fungible. The value is exclusively what it produces.

      If other people can produce the same thing, your LLM value approaches 0.

      • rprend 9 hours ago

        They sell a product, not a model. ChatGPT is a product, GPT5 is a technology.

        If you hope that ChatGPT will be worthless because the underlying technology will commodify, then you are naive and will be disappointed.

        If that logic made sense, why has it never happened before? Servers and computers have been commoditized for decades! Salesforce is just a database, social media is just a relational database, Uber is just a GPS wrapper, AWS is just a server.

        People pay money, set up subscriptions, and download apps to solve a problem, and once they solve that problem they rarely switch. ChatGPT is the fifth most visited website in the world! Facebook and DeepSeek releasing open-source models means you can make your own ChatGPT, just like you can make your own Google, and nobody will use it, just like nobody uses the dozens of “better” search engines out there.

  • m3kw9 9 hours ago

    You'd sell a dollar for 1 penny; they sell it for more like 70 cents. Different skill level.

  • signatoremo 14 hours ago

    Can you? What are you selling? Who are you and why should I believe in you? What would I get in return?

    • stavros 14 hours ago

      He can. He's selling dollars. He's a person who sells dollars for fewer dollars. You'd get dollars.

blitz_skull 7 hours ago

Revenue != Profit

OpenAI is hemorrhaging cash at an astronomical rate.

brazukadev an hour ago

No, they won't. Fake numbers from his arse, the same way ChatGPT does not have 800 million users.

riku_iki 15 hours ago

> Altman says that OpenAI will top $20 billion in ARR this year, which certainly seems like significant revenue generation. [1]

fixed this for you

  • unsupp0rted 15 hours ago

    Can he safely lie about that? Or would that be a slam-dunk lawsuit against him? He's already got Elon Musk on his enemies list.

    • 317070 14 hours ago

      People need to understand that OpenAI is not a publicly traded company. Sam is allowed to be outrageously optimistic about his best-case scenarios, as long as he is truthful with OpenAI's investors. But those investors are not "the public", so he can publicly state pretty much anything he wants, as long as it doesn't contradict facts.

      So he cannot say "OpenAI made $20B in profit last year" but can say "OpenAI will make $20B in revenue next year." Optimism is not a crime.

    • riku_iki 15 hours ago

      I am not a lawyer, but it is possible he can say whatever he wants to the public without consequences because OAI is not a public company.

      • cmiles8 14 hours ago

        Kind of, but there are limits. The investors still have LPs who aren’t going to be happy if things get messy. Things can still get really ugly even for a private company.

echelon 15 hours ago

In 2024, OpenAI claimed the bulk of its revenue, 70-80%, came from consumer ChatGPT subscriptions. That's wildly impressive.

But now they've had an order-of-magnitude jump in revenue. That can't still be consumer subscriptions, right? They'd have to have saturated that by now?

I haven't seen reports of the revenue breakdown, but I imagine it must be enterprise sales.

If it's enterprise sales, I'd imagine that was sold to F500 companies in bulk during peak AI hype. Most of those integrations are probably of the "the CEO has tasked us with `implementing an AI strategy`" kind. If so, I can't imagine they will survive in the face of a recession or economic downturn. To be frank, most of those projects probably won't pan out even under the rosiest of economic pictures.

We just don't know how to apply AI to most enterprise automation tasks yet. We have a long way to go.

I'd be very curious to see what their revenue spread looks like today, because that will be indicative of future growth and the health of the company.

  • cheschire 15 hours ago

    With less than 10% of users paying for a subscription, I doubt they have saturated.

    • debugnik 14 hours ago

      I'm reading 5% on a quick search. Isn't that an unsurprising conversion rate for a successful app with a free tier? Why would it increase further in ChatGPT's case, other than by losing non-paying customers?

  • HDThoreaun 8 hours ago

    consumer subs aren't even close to saturated, and business subs are where the real money is anyway. Most white-collar workers are still on free-tier Copilot, not paying OpenAI.