Comment by tarruda 10 days ago

> Download as many LLM models and the latest version of Ollama.app and all its dependencies.

I recently purchased a Mac Studio with 128 GB of RAM for the sole purpose of running 70B models at 8-bit quantization.
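The sizing works out on the back of an envelope: at 8 bits per weight, a 70B-parameter model needs roughly 70 GB just for its weights, which fits comfortably in 128 GB of unified memory (the exact total also depends on KV-cache size and runtime overhead, which vary with context length and architecture). A minimal sketch of that estimate, with the helper name being my own:

```python
# Back-of-envelope memory estimate for a quantized LLM's weights.
# This covers weights only; KV cache and runtime overhead add more,
# depending on context length and model architecture.

def quantized_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for a quantized model."""
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight / 1e9

print(f"70B @ 8-bit: ~{quantized_weight_gb(70, 8):.0f} GB")  # ~70 GB
print(f"70B @ 4-bit: ~{quantized_weight_gb(70, 4):.0f} GB")  # ~35 GB
```

This is also why 4-bit quantization is the common fallback on 64 GB machines: it halves the weight footprint at some cost in quality.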