Comment by tarruda
> Download as many LLM models and the latest version of Ollama.app and all its dependencies.
I recently purchased a Mac Studio with 128GB of RAM for the sole purpose of being able to run 70B models at 8-bit quantization.
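For context, a minimal sketch of what running such a model through Ollama could look like, driven from Python via the CLI. The model tag `llama3:70b-instruct-q8_0` is an assumption here; check `ollama list` or the model library for the exact name of an 8-bit 70B build.

```python
import subprocess

# Hypothetical tag for an 8-bit quantized 70B model.
MODEL = "llama3:70b-instruct-q8_0"

# Download the weights (roughly 70-75 GB for an 8-bit 70B model).
subprocess.run(["ollama", "pull", MODEL], check=True)

# Run a one-off prompt; Ollama keeps the model resident in unified memory
# between calls, which is where the 128GB Mac Studio helps.
result = subprocess.run(
    ["ollama", "run", MODEL, "Summarize why 8-bit quantization matters."],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```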
M4?