Comment by HDThoreaun 8 hours ago

Inference is profitable for OpenAI as far as I can tell. They don't need to jack up prices much; what they really need is users who are paying or consuming ads. They're burning money on free-tier users and data center expansion so they can serve more users.

NBJack 2 hours ago

This assumes your model is static and never needs to be improved or updated.

Inference is cheap because the final model, despite its size, is far less resource-intensive to run than it was to produce.

ChatGPT in its latest form isn't bad by any means, but it is falling behind. Catching up requires significant overhead, both to train and to iterate on model architecture, and that overhead is often a variable cost as well.

  • HDThoreaun 8 minutes ago

    As long as revenue rises faster than training costs and inference remains profitable, I don't think this is an issue. Eventually they'll be able to profitably amortize training across all the users.
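
    The amortization argument can be sketched with a toy calculation. All numbers below are made up purely for illustration (they are not OpenAI's actual figures): a fixed annual training bill gets divided over the user base, so per-user margin improves as users grow, provided per-user inference revenue exceeds per-user inference cost.

    ```python
    # Toy model of the amortization argument above.
    # All dollar figures and user counts are hypothetical, for illustration only.

    def annual_margin(users, revenue_per_user, inference_cost_per_user, training_cost):
        """Per-user margin after spreading a fixed training cost over all users."""
        gross = revenue_per_user - inference_cost_per_user  # inference assumed profitable
        return gross - training_cost / users

    # Hypothetical: $5B/year training, $100 revenue and $40 inference cost per user.
    for users in (50e6, 100e6, 200e6):
        margin = annual_margin(users, 100, 40, 5e9)
        print(f"{users / 1e6:.0f}M users -> ${margin:.0f}/user/year")
    ```

    With these invented numbers the business loses money per user at 50M users but turns profitable somewhere past ~83M, which is the shape of the claim: fixed training cost shrinks per head while inference margin stays constant.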