oblio 14 days ago

This assumes that the value obtained by customers is high enough to cover any possible actual cost.

Many current AI uses are low value things or one time things (for example CV generation, which is killing online hiring).

  • aurareturn 14 days ago

      Many current AI uses are low value things or one time things (for example CV generation, which is killing online hiring).
    
    We are talking about Pro subs who have high usage.
    • oblio 14 days ago

      True.

      At the end of the day, until at least one of the big providers gives us balance sheet numbers, we don't know where they stand. My current bet is that they're losing money whichever way you dice it.

      The hope, as usual, is that costs will go down and that the market share gained will make up for the losses. At that point I wouldn't be shocked by Pro licenses running into several hundred bucks per month.

gitremote 15 days ago

Currently, their inference costs for Pro subscribers exceed what they make from Pro subscriptions, because they are essentially renting out their service for a flat monthly fee instead of charging for usage (per token).

  • aurareturn 14 days ago

    Do you have a source for that?

    • gitremote 14 days ago

      When an end user asks ChatGPT a question, the chatbot application sends the system prompt, user prompt, and context as input tokens to an inference API, and the LLM generates output tokens for the inference API response.

      GPT API inference (for developers) is priced per token: input tokens, cached input tokens, and output tokens each have their own rate, billed per 1M tokens used.

      https://openai.com/api/pricing/

      https://azure.microsoft.com/en-us/pricing/details/cognitive-...

      (Inference cost is charged per token even for free models like Meta LLaMa and DeepSeek-R1 on Amazon Bedrock. https://aws.amazon.com/bedrock/pricing/ )
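      The per-token billing model described above can be sketched as a simple cost function. All rates and usage figures below are hypothetical placeholders for illustration, not actual OpenAI, Azure, or Bedrock prices:

      ```python
      # Sketch: what a month of chatbot usage costs at per-token API rates.
      # All dollar figures below are assumed placeholders, not real rates.

      PRICE_INPUT_PER_1M = 2.50    # $ per 1M input tokens (assumed)
      PRICE_CACHED_PER_1M = 1.25   # $ per 1M cached input tokens (assumed)
      PRICE_OUTPUT_PER_1M = 10.00  # $ per 1M output tokens (assumed)

      def monthly_cost(input_tok: int, cached_tok: int, output_tok: int) -> float:
          """Inference cost for one month of usage, billed per 1M tokens."""
          return (input_tok * PRICE_INPUT_PER_1M
                  + cached_tok * PRICE_CACHED_PER_1M
                  + output_tok * PRICE_OUTPUT_PER_1M) / 1_000_000

      # A hypothetical heavy user: 60M input, 20M cached, 15M output tokens/month.
      cost = monthly_cost(60_000_000, 20_000_000, 15_000_000)
      print(f"${cost:.2f}")  # $325.00 -- already above a $200 flat subscription
      ```

      The point is that under per-token billing the provider is covered no matter how chatty the user is; under a flat subscription, the same usage becomes a loss.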

      ChatGPT Pro subscription pricing (the chatbot for end users) is $200/month.

      https://openai.com/chatgpt/pricing/

      "insane thing: we are currently losing money on openai pro subscriptions!

      people use it much more than we expected."

      - Sam Altman, January 6, 2025

      https://xcancel.com/sama/status/1876104315296968813

      Again, this means that the average ChatGPT Pro user's chattiness costs OpenAI more in inference per month (too many input tokens sent and output tokens generated) than the $200/month OpenAI receives in revenue from that average Pro user.
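      The break-even point implied here is easy to sketch. Assuming a hypothetical blended inference cost of $5 per 1M tokens (a made-up figure, not a real rate):

      ```python
      # Sketch: how many tokens a $200/month flat subscription covers,
      # at an assumed blended inference cost. Both numbers are hypothetical.
      SUBSCRIPTION = 200.0          # $/month flat fee
      BLENDED_COST_PER_1M = 5.0     # assumed $/1M tokens, not an actual rate

      breakeven_tokens = SUBSCRIPTION / BLENDED_COST_PER_1M * 1_000_000
      print(f"{breakeven_tokens:,.0f} tokens/month")  # 40,000,000
      ```

      Any subscriber consuming beyond that volume is served at a loss, which is exactly the dynamic Altman's tweet describes.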

      The analogy is Netflix losing money on subscriptions because users stream too much. Banning account sharing caused many users to cancel, but that actually helped Netflix become profitable, because the extra users who used the service heavily generated more in costs than in revenue.