Comment by joshstrange 11 hours ago

You can use Ollama as the backend so the data never leaves your computer.
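For example (a minimal sketch, assuming Ollama is running locally on its default port and you've already pulled a model such as llama3.1), any OpenAI-compatible client can just be pointed at the local endpoint:

```python
# Minimal sketch: point an OpenAI-compatible client at a local Ollama server.
# Assumes Ollama is running on the default port and a model (e.g. llama3.1)
# has already been pulled with `ollama pull llama3.1`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # required by the client, ignored by Ollama
)

resp = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Summarize this note for me."}],
)
print(resp.choices[0].message.content)  # the request never leaves your machine
```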

Also, the line on “privacy” is blurry for some people when it comes to LLMs. Some people (not me) think that talking directly to the LLM provider's API is “private,” whereas talking to a service that in turn talks to the LLM is not.

And, to be fair, some people use privacy/private/etc. language for products that at least have the option of being private (e.g., via Ollama).