codingmoh 2 days ago

Thanks so much!

Was the model too big to run locally?

That’s one of the reasons I went with phi-4-mini: surprisingly high quality for its size and speed. It handled multi-step reasoning, math, structured data extraction, and code pretty well, all on modest hardware. Quantized Phi-1.5 / Phi-2 builds also run on a Raspberry Pi, as others have demonstrated.
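
For anyone who wants to try it, a minimal sketch (assuming Ollama is installed and that the phi4-mini tag is still current in its library):

    # pull the model once, then chat with it locally
    ollama pull phi4-mini
    ollama run phi4-mini "Extract the date and total from: Invoice #42, 2024-03-01, \$19.99"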

  • xyproto 2 days ago

    The models work fine with "ollama run" locally.

    When trying out "phi4" locally with:

        open-codex --provider ollama --full-auto --project-doc README.md --model phi4:latest

    I get this error:

        OpenAI rejected the request. Error details: Status: 400, Code: unknown, Type: api_error,
        Message: 400 registry.ollama.ai/library/phi4:latest does not support tools.
        Please verify your settings and try again.
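
    Despite the "OpenAI rejected" wrapper, the message text suggests the 400 comes from Ollama itself: open-codex includes a tools array in the chat request, and phi4's template doesn't declare tool support. A way to reproduce it outside open-codex (a sketch, assuming Ollama's default port 11434; get_time is a hypothetical function for illustration):

        # a chat request that includes a tool definition; phi4 lacks the
        # "tools" capability, so Ollama returns the same 400
        curl http://localhost:11434/api/chat -d '{
          "model": "phi4:latest",
          "messages": [{"role": "user", "content": "What time is it in Oslo?"}],
          "tools": [{
            "type": "function",
            "function": {
              "name": "get_time",
              "description": "Get the current time for a city",
              "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"]
              }
            }
          }]
        }'

    Swapping in a tool-capable model such as llama3.1 or qwen2.5 should let the same request go through.
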
smcleod 2 days ago

That's a really old model now. Even the older Qwen 2.5 Coder 32B is better than DSv2.