Comment by river_otter 6 hours ago

THIS!!! I'm right there with you. IMO, using aoe (or whatever session-management tool you can find) is the key unlock: it gives us the patience to actually try the ollama/lmstudio/mlx providers and see how they really perform. We're lacking information about what work is actually getting done with local coding models, at least partly because most open source repos on GitHub don't ask you to share which LLM you used to help you, and they should.

We need to start sharing our specific success and failure stories from using local LLMs for coding.