gcr 3 days ago

For new folks, you can get a local code agent running on your Mac like this:

1. $ npm install -g @openai/codex

2. $ brew install ollama; ollama serve

3. $ ollama pull gpt-oss:20b

4. $ codex --oss -m gpt-oss:20b
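
Once that's done, you can sanity-check that ollama is actually serving the model before pointing codex at it (this assumes ollama's default port, 11434):

$ ollama run gpt-oss:20b "say hello"

$ curl http://localhost:11434/api/generate -d '{"model": "gpt-oss:20b", "prompt": "say hello", "stream": false}'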

This runs entirely locally, no Internet needed. Idk if codex has telemetry, but you should be able to turn that off if so.

You need an M1 Mac or better with at least 24GB of GPU memory. The model is pretty big, taking about 16GB of disk space in ~/.ollama.
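
To see what's actually on disk, ollama will tell you (the path assumes a default install):

$ ollama list

$ du -sh ~/.ollama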

Be careful - the 120b model is 1.5× better than this 20b variant, but needs roughly 5× the resources.
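
If your machine can take it (I'd guess ~64GB of unified memory, check before you pull), the same recipe works for the big one:

$ ollama pull gpt-oss:120b

$ codex --oss -m gpt-oss:120b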

windexh8er 2 days ago

I've been really impressed by OpenCode [0]. It removes the limitations of the other frontier TUIs, and it's feature complete and performant compared to Codex or Claude Code.

[0] https://opencode.ai/
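
If you want to try it, I believe the npm package is opencode-ai (going from memory, so double-check against the site):

$ npm install -g opencode-ai

$ opencode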

  • ponyous 4 hours ago

    What kind of API subscription are you using? I found opencode to be incredibly expensive - prompts costing $5, while with aider I did the same for <$0.10.
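
    For context, aider can also point at the local ollama model from upthread - model prefix and env var per the aider docs, if I remember them right:

    $ pip install aider-chat

    $ export OLLAMA_API_BASE=http://127.0.0.1:11434

    $ aider --model ollama/gpt-oss:20b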

  • embedding-shape 2 days ago

    > OpenCode will be available on desktop soon

    Anyone happen to know what that means exactly? The install instructions at the top seem to indicate it's already available on desktop?

    • windexh8er 2 days ago

      It's a terminal only (TUI) tool today. They're releasing a graphical (GUI) version in the future.

      • embedding-shape 2 days ago

        > It's a terminal only (TUI) tool today.

        But to use that TUI you need a desktop, or at least a laptop I guess, so that distinction doesn't make sense. Are they referring to the GUI as the "Desktop Version"? Never heard it put that way before if so.

        • windexh8er 10 hours ago

          > But to use that TUI you need a desktop...

          No, you don't need a "desktop" to use a TUI. It's terminal based and has nothing to do with the desktop environment you're in.

          Also, if you have a "desktop", that assumes you're using a GUI. Pretty straightforward.

nickthegreek 3 days ago

Have you been able to build or iterate on anything of value using just the 20b to vibe code?

abacadaba 2 days ago

As much as I've been using LLMs via API all day every day, being able to run one locally on my MBA and talk to my laptop still feels like magic.

giancarlostoro 3 days ago

LM Studio is even easier, and things like the JetBrains IDEs will connect to LM Studio, same with Zed.
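
If anyone wants to wire that up by hand: LM Studio serves an OpenAI-compatible API locally (port 1234 by default, if memory serves), which is what those editors talk to:

$ lms server start

$ curl http://localhost:1234/v1/models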