Comment by parsabg
yes, the LLM can invoke observation tools (e.g. read the text/DOM or take a screenshot) to retrieve the context it needs to take the next action
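Roughly, the loop looks something like this. This is just a sketch, not the extension's actual code; the tool names and the `llm` callback are made up for illustration:

```ts
// Hypothetical observation tools the agent can call. In a real extension these
// would wrap things like chrome.scripting.executeScript (to read the DOM as
// text) or chrome.tabs.captureVisibleTab (to take a screenshot).
type Tool = (args: string) => Promise<string>;

const tools: Record<string, Tool> = {
  read_page_text: async () => "<page text placeholder>",
  take_screenshot: async () => "<base64 screenshot placeholder>",
};

// Ask the model what to do, run the requested observation tool, feed the
// result back as context, and repeat until the model decides on an action.
async function runAgent(
  llm: (prompt: string) => Promise<string>, // caller supplies the model call
  goal: string,
  maxTurns = 5,
): Promise<string> {
  let context = `Goal: ${goal}`;
  for (let turn = 0; turn < maxTurns; turn++) {
    const reply = await llm(
      `${context}\nTools: ${Object.keys(tools).join(", ")}\n` +
        `Reply TOOL:<name> to observe, or ACTION:<description> to act.`,
    );
    if (reply.startsWith("TOOL:")) {
      const name = reply.slice("TOOL:".length).trim();
      const tool = tools[name];
      const observation = tool ? await tool("") : "unknown tool";
      context += `\nObservation from ${name}: ${observation}`;
      continue; // give the model its new context and ask again
    }
    return reply; // the model has decided on its next action
  }
  return "ACTION: stop (turn limit reached)";
}
```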
You can use Ollama as the backend so the data never leaves your computer.
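For anyone who hasn't tried it, this is roughly what a local call looks like. The extension's own wiring may differ; the helper and model name here are just for illustration:

```ts
// Minimal sketch of pointing an LLM client at a local Ollama server instead of
// a hosted API. Ollama listens on localhost:11434 by default, so prompts and
// page content never leave the machine.
async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // whatever model you've pulled locally
      messages: [{ role: "user", content: prompt }],
      stream: false, // one JSON response instead of a stream
    }),
  });
  const data = await res.json();
  return data.message.content; // non-streaming chat responses carry the reply here
}

// Example: askLocalModel("Summarize the visible page text").then(console.log);
```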
Also, the line on “privacy” is blurry for some people when it comes to LLMs. Some people (not me) consider it “private” if you are talking directly to the LLM provider’s API, but not if you are talking to a service that talks to the LLM on your behalf.
And, to be fair, some people use “privacy”/“private” language for products that at least have the option of being private (e.g. via Ollama).
> How is it “privacy-first” then if it literally sends all your shit to the LLM?
Because it supports Ollama, which runs the LLM entirely locally on your own hardware, so the data sent to it never leaves your machine?
Edit: joshstrange beat me to the same conclusion by mere moments. :)
So maybe that’s something we want to be mindful of before using this on banking, health, etc. sites.
How is it “privacy-first” then if it literally sends all your shit to the LLM?