Comment by marak830
Ollama is a good one, and LM Studio is great for those who are unsure where to start (it will help you pick a model that fits your system specs).
If you use Open WebUI (I recommend running it via Docker), you can access your Ollama-hosted model through the browser on any device on your network. Tailscale will help make that accessible remotely.
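For anyone wanting to try that setup, something like this is the usual starting point (a sketch based on the Open WebUI docs, assuming Ollama is already running on the host machine; the host port 3000 and volume name are just example choices, adjust to taste):

```shell
# Run Open WebUI in Docker, connecting to Ollama on the host.
# --add-host lets the container reach the host's Ollama at host.docker.internal
# -v persists chats/settings in a named volume so they survive restarts
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 (or the machine's LAN/Tailscale address on port 3000) from any device on your network.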
I'm currently working on an open-source long-term memory system designed to work with Ollama, to help local models be more competitive with the big players, so we're not so beholden to these big companies.
That sounds great — thank you for working on this. I’m not a developer, just curious about AI in general. Local AI feels like the right direction if we want to save energy and water, too. Is your memory system open source?