Comment by theshrike79 4 days ago

We all want to move to local models eventually for privacy and reliability.

They don't (and won't) have infinite context without trickery or massive €€€ use.

The current crop of online LLMs is running on VC money, slightly offset by subscriptions - but still at a loss. The hype and money will run out, so use them as much as possible now. But also keep your workflows portable so they will work locally when the time comes.

Don't be that 10x coder who becomes a 0.1x coder when Anthropic has issues on their side =)

cyanydeez 4 days ago

I don't see how anyone could make a successful product built on cloud LLMs. Even if you get a perfect workflow, you'll either be gouged by price rises or lose out to model changes and context/prompt divergence. All this "prompt" nonsense is simply playing to the LLM audience, and no amount of imprecise prompting will negate the fundamental instability.

So yeah, you have to use a local LLM if you think there's a viable product to be had. Anyone who's been programming knows that once you reach the milestone of a complete and finished project, it can be mothballed for decades, generating utility and requiring limited maintenance. All of that goes out the window if you require a cloud provider to remain stable for a decade.