Comment by 0xbadcafebee 2 days ago
If you have an old M1 MacBook lying around, you can use that to run a local model; then it only costs whatever the electricity costs. It may not be a frontier model, but local models are insanely good now compared to before. Some people are buying Mac Minis for this, but many kinds of old/cheap hardware work: an old 1U/2U server some company is throwing out in a tech refresh, loaded with old RAM and an old GPU off eBay, is pretty perfect. A MacBook M1 Max or a Mac Mini with 64GB RAM is much quieter, more power efficient, and more compact, but even my ThinkPad T14s runs local models. Then you can start optimizing inference settings and get it to run nearly 2x faster.
(Keep in mind, regarding the cost savings: first do an initial calculation of your cloud cost using a low-cost cloud model, not the default one, then multiply by 1-2 years of usage and compare that total to the cost of a local machine plus the power bill. Don't just buy hardware because you assume it's cheaper; cloud models are generally cost effective.)
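The comparison above is just back-of-envelope arithmetic. Here's a minimal sketch of it in Python; every number (token volume, per-token price, hardware cost, wattage, electricity rate) is a placeholder assumption, not a recommendation, so plug in your own:

```python
# Hedged sketch: cloud API spend vs. local hardware + electricity.
# All concrete numbers are illustrative assumptions -- substitute your own.

def cloud_cost(tokens_per_month, price_per_million_tokens, months):
    """Total cloud API spend over the period, in dollars."""
    return tokens_per_month / 1_000_000 * price_per_million_tokens * months

def local_cost(hardware_price, watts, hours_per_day, price_per_kwh, months):
    """Hardware purchase price plus electricity over the period, in dollars."""
    kwh = watts / 1000 * hours_per_day * 30 * months  # ~30 days/month
    return hardware_price + kwh * price_per_kwh

# Assumed scenario: 50M tokens/month on a budget cloud model at $0.50/M tokens,
# vs. a used Mac Mini at $600 drawing ~40 W for 8 h/day at $0.15/kWh, over 2 years.
cloud = cloud_cost(50_000_000, 0.50, 24)
local = local_cost(600, 40, 8, 0.15, 24)
print(f"cloud: ${cloud:.2f}  local: ${local:.2f}")
```

Under these particular assumptions the two come out within a few percent of each other, which is the point: run the numbers before buying hardware.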
> don't just buy hardware because you think it's cheaper
Surely there is also the benefit of data privacy, and of not having a private company build yet another ad profile of me to sell later on?