danpalmer 7 days ago

Zed. They've upped their game in the AI integration and so far it's the best one I've seen (external from work). Cursor and VSCode+Copilot always felt slow and janky; Zed is much less janky, feels like pretty mature software, and I can just plug in my Gemini API key and use that for free/cheap instead of paying for the editor's own integration.

vimota 7 days ago

I gave Zed an in-depth trial this week and wrote about it here: https://x.com/vimota/status/1921270079054049476

Overall Zed is super nice and the opposite of janky, but I still found that a few of the defaults were off and Python support was still missing in a few key ways for my daily workflow.

submeta 7 days ago

Consumes lots of resources on an M4 MacBook. Would love to test it though, if it didn’t freeze my MacBook.

Edit:

With the latest update to 0.185.15 it works perfectly smoothly. Excellent addition to my setup.

  • _bin_ 7 days ago

    I'll second the Zed recommendation, sent from my M4 MacBook. I don't know why exactly it's doing this for you, but mine is idling with ~500MB RAM (about as little as you can get with a reasonably-sized Rust codebase and a language server) and 0% CPU.

    I have also really appreciated something that felt much less janky, had better vim bindings, and wasn't slow to start even on a very fast computer. You can completely botch Cursor if you type really fast. On an older mid-range laptop, I ran into problems with a bunch of its auto-pair stuff of all things.

    • drcongo 7 days ago

      Yeah, same. Zed is incredibly efficient on my M1 Pro. It's my daily driver these days, and my Python setup in it is almost perfect.

  • enceladus06 3 days ago

    Are you running an Ollama local model or one of the Zed LLMs?

charlie0 18 hours ago

Why are the Zed guys so hung up on UI rendering times? I don't care that the UI can render at 120FPS if it takes 3 seconds to get a response from an LLM. I do like the clean UI though.

xmorse 6 days ago

I am using Zed too; it still has some issues, but it is comparable to Cursor. In my opinion they iterate even faster than the VSCode forks.

  • DrBenCarson 6 days ago

    Yep, not having to build off a major fork will certainly help you move fast.

allie1 6 days ago

I just wish they'd release a debugger already. Once it's done I'll be moving to them completely.

frainfreeze 7 days ago

Zed doesn't even run on my system, and the relevant GitHub issue is only updated by people who come to complain about the same issue.

  • Aeolun 6 days ago

    Don’t use Windows? I don’t feel like that’s a terribly uncommon proposition for a dev.

  • KomoD 6 days ago

    Windows? If so, you can run it; you just have to build it.

    • frainfreeze 5 days ago

      Debian latest stable.

      • KomoD 5 days ago

        Oh, then what's the issue? I'm using Zed on Mint and so far I've only had one issue: the window being invisible (which I fixed by updating GPU drivers).

wellthisisgreat 7 days ago

Does it have Cursor’s “tab” feature?

  • dvtfl 7 days ago
    • eadz 7 days ago

      It would be great if there were an easy way to run their open model (https://huggingface.co/zed-industries/zeta) locally (for latency reasons).

      I don't think Zeta is quite up to Windsurf's completion quality/speed.

      I get that this would go against their business model, but maybe people would pay for this - it could in theory be the fastest completion since it would run locally.
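
      Something like this might be enough to try it locally; an untested sketch that assumes the checkpoint loads as a standard causal LM via Hugging Face transformers, with a toy prompt rather than Zeta's real edit-prediction format:

      ```python
      # Rough sketch: load zed-industries/zeta locally with Hugging Face transformers.
      # Assumes the repo works as a plain causal-LM checkpoint; Zeta's actual
      # edit-prediction prompt format is not reproduced here.
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "zed-industries/zeta"
      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

      prompt = "def fibonacci(n):\n    "  # toy completion prompt
      inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
      outputs = model.generate(**inputs, max_new_tokens=32)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))
      ```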

      • rfoo 6 days ago

        > the fastest completion since it would run locally

        We are living in a strange age where local is slower than the cloud, due to the sheer amount of compute we need to do. Compute takes hundreds of milliseconds (if not seconds) on local hardware, making 100ms of network latency irrelevant.

        Even for a 7B model, your expensive Mac or 4090 can't beat, for example, a box with 8x A100s running a FOSS serving stack (sglang) with TP=8 on latency.
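
        To make that concrete with made-up but plausible numbers (assumptions for illustration, not benchmarks):

        ```python
        # Toy back-of-envelope comparison of local vs. cloud completion latency.
        # All numbers are illustrative assumptions, not measurements.
        new_tokens = 64               # tokens generated per completion

        local_tok_per_s = 30          # assumed decode speed for a 7B model on local hardware
        cloud_tok_per_s = 300         # assumed decode speed on an 8x A100 box with TP=8
        network_rtt_s = 0.100         # assumed round-trip network latency to the cloud

        local_latency = new_tokens / local_tok_per_s                  # ~2.13 s
        cloud_latency = network_rtt_s + new_tokens / cloud_tok_per_s  # ~0.31 s

        print(f"local: {local_latency:.2f}s, cloud: {cloud_latency:.2f}s")
        ```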

      • xmorse 7 days ago

        Running models locally is very expensive in terms of memory and scheduling requirements. Maybe instead they should host their model on the Cloudflare AI network, which is distributed all around the world and can have lower latency.
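
        If they went that route, calling it over the Workers AI REST API would look roughly like this; a sketch only, and the model identifier is hypothetical since Zeta is not actually hosted there:

        ```python
        # Hypothetical sketch: querying a model hosted on Cloudflare Workers AI.
        # ACCOUNT_ID, API_TOKEN, and the model identifier are placeholders; Zeta is
        # not actually available on Workers AI.
        import requests

        ACCOUNT_ID = "your-account-id"
        API_TOKEN = "your-api-token"
        MODEL = "@cf/zed-industries/zeta"  # hypothetical model name

        url = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/run/{MODEL}"
        resp = requests.post(
            url,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            json={"prompt": "def fibonacci(n):"},
        )
        print(resp.json())
        ```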

  • Aeolun 6 days ago

    Sort of. The quality is night and day different (Cursor feels like magic, Zed feels like a chore).

    • atonse 6 days ago

      I can second this. I really do want to move to Zed full time, but the code completion is nowhere near as useful or "smart" as Cursor's yet.

    • vendiddy 5 days ago

      Yep, I want Zed to win, but it has not yet become my daily driver.