Comment by cvzakharchenko 10 months ago


Oh, I’ve yet to find a good alternative to Cursor’s RAG-powered side chat. It helps me so much when working with huge codebases. Tried Continue, but it’s very unstable and doesn’t work as well. Would prefer a command line solution; a VS Code plugin is the next choice; having a separate editor is not ideal. But I’m glad there’s some competition.

Terretta 10 months ago

Aider is AI pair programming in your terminal.

Aider lets you pair program with LLMs, to edit code in your local git repository. Start a new project or work with an existing git repo. Aider works best with GPT-4o & Claude 3.5 Sonnet and can connect to almost any LLM.

https://aider.chat/

See LLM Leaderboards as well: https://aider.chat/docs/leaderboards/

  • cvzakharchenko 10 months ago

    Thanks! I use Aider with Claude 3.5 Sonnet on smaller projects sometimes, and it's really fun and helpful when it can put a whole repo map into the LLM's context.
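
    Aider’s actual repo map is built with tree-sitter and covers many languages, so the following is only an illustrative sketch of the idea: a compact outline of each file’s top-level symbols that is cheap enough to fit into the model’s context. A Python-only analog using the stdlib `ast` module might look like:

```python
import ast

def repo_map(sources: dict[str, str]) -> str:
    """Build a compact outline of top-level functions and classes per file.

    Rough, Python-only analog of the idea behind Aider's tree-sitter
    repo map (the real one parses many languages and ranks symbols).
    Input maps file paths to their source text.
    """
    lines = []
    for path, code in sorted(sources.items()):
        lines.append(f"{path}:")
        tree = ast.parse(code)
        for node in tree.body:
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                args = ", ".join(a.arg for a in node.args.args)
                lines.append(f"  def {node.name}({args})")
            elif isinstance(node, ast.ClassDef):
                lines.append(f"  class {node.name}")
                # List method names only, not bodies: the point of a
                # repo map is a small, skimmable symbol index.
                for item in node.body:
                    if isinstance(item, ast.FunctionDef):
                        lines.append(f"    def {item.name}(...)")
    return "\n".join(lines)
```

    The outline, rather than file bodies, is what gets sent to the model; the model can then ask for the specific files it needs.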

fredoliveira 10 months ago

> Would prefer a command line solution

I haven't tried this myself, so I apologize if it ends up being bad, but I've seen Aider [0] get linked a few times by people who wanted a CLI solution for AI code completion.

[0]: https://aider.chat/

  • cvzakharchenko 10 months ago

    Aider is really cool for small projects, but it builds a repo map instead of using RAG. That works on small codebases, but totally fails to be useful on large ones.

    • Terretta 10 months ago

      How large?

      In use, the tree-sitter-derived approach seems far more effective than RAG.

      • cvzakharchenko 10 months ago

        >5k source files. They don't fit into the context. I know I can limit what is sent, and I can attach files in the Aider chat myself, but that's not ideal for getting an LLM to answer questions about a codebase when I don't know much context beforehand. With Cursor, I can just ask "@codebase How is a %feature% implemented?", and it's very quick and often helpful with a couple of follow-ups.
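
    The retrieval idea behind an "@codebase" query is what lets this scale past the context window: index small chunks of the repo, score them against the question, and send only the top hits to the model. Cursor presumably uses learned embeddings and a vector store for this; purely as a toy sketch of the retrieval step, a stdlib-only TF-IDF index over code chunks could look like:

```python
import math
import re
from collections import Counter

class CodeRetriever:
    """Toy retrieval index over code chunks (TF-IDF + cosine similarity).

    Illustrates the shape of an "@codebase" query: instead of stuffing
    thousands of files into the context, score indexed chunks against
    the question and forward only the top-k to the model. Real tools
    would use learned embeddings and a proper vector store instead.
    """

    def __init__(self, chunks: dict[str, str]):
        self.names = list(chunks)
        self.docs = [self._tokens(chunks[n]) for n in self.names]
        # Document frequency: in how many chunks each token appears.
        self.df = Counter()
        for toks in self.docs:
            self.df.update(set(toks))
        self.n = len(self.docs)
        self.vecs = [self._vec(toks) for toks in self.docs]

    @staticmethod
    def _tokens(text: str) -> list[str]:
        # Identifier-style tokenization; good enough for source code.
        return re.findall(r"[a-zA-Z_]\w*", text.lower())

    def _vec(self, toks: list[str]) -> dict[str, float]:
        tf = Counter(toks)
        # IDF dampens tokens that appear in nearly every chunk.
        return {t: c * math.log(1 + self.n / self.df.get(t, 1))
                for t, c in tf.items()}

    def query(self, text: str, k: int = 2) -> list[str]:
        q = self._vec(self._tokens(text))

        def cos(a: dict[str, float], b: dict[str, float]) -> float:
            dot = sum(v * b.get(t, 0.0) for t, v in a.items())
            na = math.sqrt(sum(v * v for v in a.values()))
            nb = math.sqrt(sum(v * v for v in b.values()))
            return dot / (na * nb) if na and nb else 0.0

        ranked = sorted(zip(self.names, self.vecs),
                        key=lambda nv: cos(q, nv[1]), reverse=True)
        return [name for name, _ in ranked[:k]]
```

    The answer pipeline then becomes: retrieve the top-k chunks, put them (not the whole repo) into the prompt, and ask the question, which is why it stays fast on >5k files.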

kvakkefly 10 months ago

Not sure if this is what you'd like, but I remember this repo from some time ago: https://github.com/Storia-AI/sage

Storia-AI/sage: Chat with any codebase with 2 commands

  • cvzakharchenko 10 months ago

    Thanks for the suggestion! It needs some work to set up, and it looks like it only works on GitHub repos. Also, to work with non-local LLMs, you can only use Pinecone for vector storage. I might have misunderstood something, but I will check it out again later.
