jgalt212 2 days ago (1 reply)
Our customers insist we run everything on their docs locally.
fzysingularity 2 days ago
Absolutely, we've been hearing the same from our customers - which is why we thought it made sense to open-source a set of schemas so that they're reusable and compatible across inference providers (especially Ollama and other local ones).
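The "reusable across providers" idea can be sketched roughly as follows: define one JSON Schema and hand the same object to different backends. This is a minimal illustration, not one of the actual open-sourced schemas; the field names and model names are made up, and it assumes Ollama's structured-output `format` field and the OpenAI-style `response_format` shape.

```python
import json

# One JSON Schema, defined once and reused across providers.
# (Illustrative example; not one of the actual open-sourced schemas.)
invoice_schema = {
    "type": "object",
    "properties": {
        "vendor": {"type": "string"},
        "total": {"type": "number"},
        "currency": {"type": "string"},
    },
    "required": ["vendor", "total", "currency"],
}

# Ollama (local): /api/chat accepts the schema directly in "format".
ollama_payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Extract the invoice fields."}],
    "format": invoice_schema,
}

# OpenAI-style providers: the same schema goes under response_format.
openai_payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Extract the invoice fields."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {"name": "invoice", "schema": invoice_schema},
    },
}

# Both payloads reference the identical schema object.
print(json.dumps(invoice_schema, indent=2))
```

The point of the sketch is that only the request envelope changes per provider; the schema itself stays a plain, portable JSON object.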