Comment by sowbug a day ago

I wouldn't mind my own offline Gemini or ChatGPT 5. But even if the hardware and model were free, I don't know how I'd afford the electricity.

mitthrowaway2 a day ago

If you can't afford the electricity to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!

But anyway, the trick is to run it in the winter and keep your house warm.

  • sowbug a day ago

    I think you're underestimating economies of scale, and today's willingness of large corporations to provide cutting-edge services at a loss.

    • mitthrowaway2 11 hours ago

      I don't think I am. I don't think economies of scale on hardware will drive costs below free, and while subscription providers might be willing to offer services below the cost of the hardware that runs them, I don't think they'll offer services below the cost of the electricity that runs them.

      And while data centers might sign favorable contracts, I don't think they are getting electricity that far below retail.

jdprgm a day ago

A single machine for personal inference on models of this size isn't going to idle at a power draw so high that electricity becomes a problem. For personal use it wouldn't be under load often, and if for some reason you can keep it under heavy load, presumably it's doing something valuable enough to easily justify the electricity.
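The argument above amounts to a back-of-envelope calculation, which might look like this sketch. The wattages, usage pattern, and electricity rate here are illustrative assumptions, not measurements of any particular machine:

```python
# Rough monthly electricity cost for a personal inference box.
# All figures are assumptions for illustration: a workstation
# idling at ~50 W, drawing ~500 W under load, running at load
# 1 hour/day, billed at a retail rate of $0.15/kWh.

IDLE_WATTS = 50
LOAD_WATTS = 500
LOAD_HOURS_PER_DAY = 1
RATE_PER_KWH = 0.15
HOURS_PER_MONTH = 24 * 30

# Split the month's hours between idle and loaded operation.
idle_kwh = IDLE_WATTS / 1000 * (HOURS_PER_MONTH - LOAD_HOURS_PER_DAY * 30)
load_kwh = LOAD_WATTS / 1000 * LOAD_HOURS_PER_DAY * 30
monthly_cost = (idle_kwh + load_kwh) * RATE_PER_KWH

print(f"idle: {idle_kwh:.1f} kWh, load: {load_kwh:.1f} kWh, "
      f"monthly cost: ${monthly_cost:.2f}")
```

Under these assumed numbers the bill lands in the single digits of dollars per month, dominated by idle draw rather than the occasional bursts of inference.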