Comment by codingmoh
Thanks so much!
Was the model too big to run locally?
That’s one of the reasons I went with phi-4-mini: surprisingly high quality for its size and speed. It handled multi-step reasoning, math, structured data extraction, and code pretty well, all on modest hardware. Quantized versions of Phi-1.5 / Phi-2 even run on a Raspberry Pi, as others have demonstrated.
The models work fine with `ollama run` locally.
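For reference, a quick sanity check with Ollama directly (assuming you have Ollama installed and the `phi4` model tag available) would look something like:

```bash
# Pull the model once, then chat with it directly to confirm it works outside open-codex
ollama pull phi4
ollama run phi4 "Summarize this project's README in two sentences."
```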
When trying out `phi4` locally with:

```bash
open-codex --provider ollama --full-auto --project-doc README.md --model phi4:latest
```
I get this error: