Comment by alexfromapex 3 days ago
I have a MacBook Pro with an M3 Max and 128 GB of unified memory. I use Ollama with Open WebUI. It performs well with models up to about 80B parameters, though the machine gets very hot with anything over 20B.
I occasionally use it for simple text-based tasks when my Internet connection or ChatGPT is down.
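For anyone curious how the fallback works: Ollama exposes a local HTTP API on port 11434, so a minimal stdlib-only script can query whatever model you have pulled. This is just a sketch (the model name is an example, and it assumes Ollama is running):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_request(model: str, prompt: str) -> bytes:
    # /api/generate takes a JSON body; stream=False returns a single response
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def ask(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (model name is whatever you've pulled locally):
# ask("llama3.1:70b", "Summarize this paragraph: ...")
```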
I also use it in VS Code for code completion via the Continue extension.
I also created a Firefox extension that opens Open WebUI with Cmd+Shift+Space, so I can ask a question whenever I'm browsing the web: https://addons.mozilla.org/en-US/firefox/addon/foxyai/