Knowledge graphs using Ollama and embeddings to answer and visualize queries
(github.com)
122 points by punnerud 10 months ago
If you don't want to make direct API calls, there are official Ollama Python bindings[1]. Cool project though!
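A minimal sketch of the difference, assuming a default local Ollama install; the model name is just a placeholder:

```python
# Same request made two ways: against the raw HTTP API, then via the
# official ollama package. Assumes Ollama is running locally on the
# default port and the model has already been pulled.
import requests
import ollama

# Direct API call to the local Ollama server
r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Say hi", "stream": False},
)
print(r.json()["response"])

# Same call through the official Python bindings
resp = ollama.generate(model="llama3", prompt="Say hi")
print(resp["response"])
```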
Nice, thanks for the feedback. I have a prototype that also uses the embeddings to categorize the steps with "tags/labels". I almost take it as a challenge to reason better with a smaller model than the >70B ones you can't run on your own laptop.
I actually built something similar to this a couple of days ago for finding duplicate bugs in our GitHub repo. Some differences:
* I used JSON to store the blobs in SQLite instead of converting them to byte form (I think they're roughly equivalent in the end?)
* For the distance calculation I use `numpy.linalg.norm(a-b)` to subtract the two vectors and then take the norm
* `ollama.embed()` and `ollama.generate()` will cut down on the requests code (rough sketch of the whole flow below)
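Roughly what that looks like put together; the model name, table layout and response field names here are my assumptions, not the exact code:

```python
# Embeddings from the official ollama package, stored as JSON text in
# SQLite, compared with a Euclidean distance (norm of the difference).
import json
import sqlite3

import numpy as np
import ollama

conn = sqlite3.connect("bugs.db")
conn.execute("CREATE TABLE IF NOT EXISTS bugs (title TEXT, embedding TEXT)")

def embed(text: str) -> np.ndarray:
    # ollama.embed returns the vectors under "embeddings", one per input
    resp = ollama.embed(model="nomic-embed-text", input=text)
    return np.array(resp["embeddings"][0])

def store(title: str) -> None:
    # store the vector as a JSON string instead of a raw byte blob
    vec = embed(title)
    conn.execute("INSERT INTO bugs VALUES (?, ?)", (title, json.dumps(vec.tolist())))
    conn.commit()

def closest(title: str) -> tuple[str, float]:
    # subtract the two vectors and take the norm to get the distance
    query = embed(title)
    rows = conn.execute("SELECT title, embedding FROM bugs").fetchall()
    return min(
        ((t, float(np.linalg.norm(query - np.array(json.loads(e))))) for t, e in rows),
        key=lambda pair: pair[1],
    )

store("App crashes when opening settings")
print(closest("Crash on the settings screen"))
```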
Can you use https://github.com/abetlen/llama-cpp-python, or do you need something that Ollama provides?
Speaking of embeddings, have you seen https://jina.ai/news/jina-embeddings-v3-a-frontier-multiling... ?
Switching to a lower-level integration will probably not improve the speed; the waiting is mostly on the Llama model's text generation.
Should be easy to switch embeddings.
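If someone did want to swap the embedding backend to llama-cpp-python, it could look roughly like this; the model path is a placeholder and the rest is my assumption about how it would slot in:

```python
# Local GGUF embedding model via llama-cpp-python, exposing the same
# embed() interface used elsewhere in the thread.
import numpy as np
from llama_cpp import Llama

# embedding=True loads the model in embedding mode
llm = Llama(model_path="nomic-embed-text-v1.5.Q4_K_M.gguf", embedding=True)

def embed(text: str) -> np.ndarray:
    out = llm.create_embedding(text)
    return np.array(out["data"][0]["embedding"])

print(embed("knowledge graphs with local models").shape)
```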
Already playing with adding different tags to previous answers using embeddings, then using that to improve the reasoning.
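One way that tagging idea could work (a sketch under my own assumptions about labels and model, not the actual prototype): embed a small set of label names once, then give each previous answer the label whose embedding is closest.

```python
# Assign a tag to an answer by nearest embedding among a fixed label set.
import numpy as np
import ollama

LABELS = ["fact lookup", "calculation", "comparison", "summary"]

def embed(text: str) -> np.ndarray:
    return np.array(ollama.embed(model="nomic-embed-text", input=text)["embeddings"][0])

label_vectors = {label: embed(label) for label in LABELS}

def tag(answer: str) -> str:
    vec = embed(answer)
    # pick the label whose embedding is nearest by Euclidean distance
    return min(label_vectors, key=lambda label: np.linalg.norm(vec - label_vectors[label]))

print(tag("Paris has roughly twice the population of Rome."))
```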
You should probably have a requirements.txt file instead of just a list of requirements. It's often hard to tell which combination of package versions will actually work when running these things.
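Something as small as this would already help; the packages and pins below are just a hypothetical example, not the project's real dependency set:

```
# requirements.txt (hypothetical pins)
ollama==0.3.3
numpy==1.26.4
requests==2.31.0
```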