Comment by zlwaterfield 2 days ago
Correct but I'm going to loom into a locally running LLM so it would be free.
Look into allowing it to connect to either an LM Studio endpoint or Ollama, please.
Please do (assuming you mean "look"). When you add support for a custom API URL, please make sure it supports HTTP Basic authentication.
That's super useful for people who run, say, Ollama behind an nginx reverse proxy that adds authentication.
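For reference, HTTP Basic authentication is just an `Authorization: Basic <base64("user:password")>` header sent with each request, so a client that already supports custom API URLs only needs to attach one extra header. A minimal sketch in Python (the function name and credentials here are hypothetical, not from any specific project):

```python
import base64


def basic_auth_header(username: str, password: str) -> dict:
    """Build an HTTP Basic auth header per RFC 7617:
    base64-encode "username:password" and prefix with "Basic "."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}


# Example: headers a client would send to an Ollama endpoint
# sitting behind an authenticating nginx reverse proxy.
headers = basic_auth_header("user", "secret")
print(headers["Authorization"])  # → Basic dXNlcjpzZWNyZXQ=
```

The same header works whether the upstream is Ollama, LM Studio, or any other OpenAI-compatible endpoint, since the proxy strips or validates it before forwarding the request.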