Comment by kvakkefly
Not sure if this is what you'd like, but I remember this repo from some time ago https://github.com/Storia-AI/sage
Storia-AI/sage: Chat with any codebase with 2 commands
Thanks for the suggestion! It needs some work to set up, and it looks like it only supports GitHub repos. Also, when using non-local LLMs, Pinecone appears to be the only option for vector storage. I might have misunderstood something, so I'll take another look later.