jbmsf 2 days ago

Interesting. We're using a SaaS solution for document extraction right now. I don't know if it's in our interest to build out more, but I do like the idea of keeping extraction local.

jgalt212 2 days ago

Our customers insist we run everything on their docs locally.

    fzysingularity 2 days ago

    Absolutely, we've been hearing the same from our customers, which is why we thought it makes sense to open source a bunch of schemas so that they're reusable and compatible across various inference providers (especially Ollama and other local ones).
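
    As a rough sketch of what one of those reusable schemas could look like (a hypothetical Pydantic-style model; the field names are illustrative, not taken from the actual release), the same definition can be serialized to JSON Schema and handed to any provider that supports structured output, including Ollama's format parameter:

        # Hypothetical provider-agnostic extraction schema (illustrative fields only).
        from typing import List
        from pydantic import BaseModel, Field

        class LineItem(BaseModel):
            description: str
            quantity: float
            unit_price: float

        class Invoice(BaseModel):
            """Fields to extract from an invoice document."""
            invoice_number: str
            issue_date: str = Field(description="ISO 8601 date, e.g. 2024-05-01")
            vendor_name: str
            total_amount: float
            line_items: List[LineItem] = Field(default_factory=list)

        # Serialize to JSON Schema; this is what gets passed to the
        # inference provider (local or hosted) for constrained decoding.
        print(Invoice.model_json_schema())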

fzysingularity 2 days ago

Cool! What types of documents do you currently handle? We could share some of our learnings and schemas here too.