Comment by segmenta 4 days ago

Yes - you can use local LLMs through LiteLLM and Ollama. Would you like us to support anything else?

thedangler 4 days ago

LM Studio?

  • ramnique 4 days ago

    Yes, because LM Studio is OpenAI-compatible. When you run rowboatx for the first time, it creates ~/.rowboat/config/models.json. You can then configure LM Studio there. Here is an example: https://gist.github.com/ramnique/9e4b783f41cecf0fcc8d92b277d...

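For context, an OpenAI-compatible entry in models.json might look roughly like the sketch below. The field names and model name here are illustrative guesses, not rowboatx's actual schema (the linked gist shows the real format); the base URL reflects LM Studio's default local server port, 1234:

```json
{
  "lmstudio": {
    "_note": "illustrative entry only; field names are assumptions, see the linked gist for the real schema",
    "provider": "openai",
    "baseURL": "http://localhost:1234/v1",
    "apiKey": "lm-studio",
    "model": "qwen2.5-7b-instruct"
  }
}
```

Because LM Studio exposes an OpenAI-compatible HTTP API, any client that lets you override the base URL and model name can usually be pointed at it this way, with the API key treated as a placeholder.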