Comment by calcsam 5 months ago

Yup! We rely on the AI SDK for model routing, and they have an Ollama provider, which will handle pretty much any local model.
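For anyone wanting to try this, here is a minimal sketch of routing to a local model through the AI SDK's Ollama provider. It assumes the community `ollama-ai-provider` package and uses an illustrative model name; swap in whatever model you have pulled locally.

```ts
// Sketch: AI SDK model routing through a local Ollama instance.
// Assumes the community `ollama-ai-provider` package is installed
// and Ollama is running locally with the named model pulled.
import { generateText } from "ai";
import { ollama } from "ollama-ai-provider";

const { text } = await generateText({
  model: ollama("llama3.1"), // any model available in your local Ollama
  prompt: "Say hello from a local model.",
});

console.log(text);
```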

Comment by tough 5 months ago

Can confirm this works well with any OpenAI-like API endpoint, such as Ollama or LM Studio's.
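A rough sketch of that approach, pointing the AI SDK's OpenAI provider at a local OpenAI-compatible server. The base URL and model name here are illustrative (LM Studio's local server commonly listens on port 1234, Ollama's OpenAI-compatible endpoint on 11434); local servers typically ignore the API key.

```ts
// Sketch: using the AI SDK's OpenAI provider against a local
// OpenAI-compatible server (e.g. LM Studio or Ollama's /v1 endpoint).
import { generateText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const localProvider = createOpenAI({
  baseURL: "http://localhost:1234/v1", // e.g. LM Studio; Ollama uses http://localhost:11434/v1
  apiKey: "not-needed",                // local servers usually ignore the key
});

const { text } = await generateText({
  model: localProvider("qwen2.5-7b-instruct"), // whatever model the local server has loaded
  prompt: "Say hello from a local model.",
});

console.log(text);
```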