Comment by cbsmith
> Nobody wants to pay for a trillion dollar cloud bill.
Buying dedicated hardware as a way to keep your AI bill down seems like a tough proposition for the average consumer. Unless you're using AI constantly, renting capacity when you need it is just going to be cheaper. The real win with an on-device model is that you don't have to go out to the network in the first place.
You misunderstood what I meant: I mean make models that run on potatoes. Nobody wants to pay what ChatGPT's subscription probably SHOULD cost for them to make a profit.