Comment by tartoran 14 hours ago

There's no doubt you're getting a lot of value from OpenAI; I am too. And yes, the subscription delivers far more value than what you pay for. That's because they're burning investors' money, and it isn't sustainable. Once the money runs out they'll have to jack up prices, and that's the moment of truth: we'll see what users are actually willing to pay for. Google or another company may be able to provide all of that much more cheaply.

HDThoreaun 8 hours ago

Inference is profitable for OpenAI as far as I can tell. They don't need to jack up prices much; what they really need is users who are paying or consuming ads. They're burning money on free-tier users and data center expansion so they can serve more users.

  • NBJack 2 hours ago

    This assumes your model is static and never needs to be improved or updated.

    Inference is cheap because the final model, despite its size, is far less resource-intensive to use than it is to produce.

    ChatGPT in its latest form isn't bad by any means, but it is falling behind, and keeping up requires significant overhead, both to train new models and to iterate on model architecture. That cost is often variable as well.

    • HDThoreaun 6 minutes ago

      As long as revenue rises faster than training costs and inference remains profitable, I don't think this is an issue. Eventually they'll be able to profitably amortize training across all the users.
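
      A rough back-of-the-envelope sketch of that amortization point (all numbers here are made up purely to show the shape of the argument, not actual OpenAI figures):

        # Hypothetical figures only -- none of these come from OpenAI.
        TRAINING_COST = 5e9        # assumed annual training + R&D spend, dollars
        MARGIN_PER_USER = 60.0     # assumed yearly inference gross profit per paying user

        def annual_profit(paying_users: float) -> float:
            """Profit once the fixed training spend is amortized over the user base."""
            return paying_users * MARGIN_PER_USER - TRAINING_COST

        for users in (10e6, 50e6, 100e6, 200e6):
            print(f"{users / 1e6:>5.0f}M paying users: ${annual_profit(users) / 1e9:+.1f}B")

      The fixed training spend stays roughly flat while the inference margin scales with the user base, so past some break-even user count the whole operation flips profitable. Whether per-user revenue and training costs actually behave that way is the open question.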