Comment by unleaded
i'm guessing you've never seen r/LocalLLaMA?
It's a miracle that open-weight LLMs are even a thing at all, let alone as good as they are (very).
You need thousands of dollars of hardware to run a decent coding model at bearable tokens/s.