Comment by fzysingularity 2 days ago
Is Qwen2.5-VL on Ollama? Could give it a try with a few of the schemas we have.
We’ve locally tested with Llama 3.2 11B Vision on Ollama: https://github.com/vlm-run/vlmrun-hub/blob/main/tests/benchm...
FWIW, I think Ollama's structured-outputs API is quite buggy compared to the HF Transformers variant.
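For context, Ollama's structured outputs work by passing a JSON schema via the chat API's `format` field. A minimal sketch of how a Pydantic schema would be wired up (the `Invoice` model here is hypothetical, not one of the vlmrun-hub schemas; the commented-out call assumes the `ollama` Python client and a running local server):

```python
from pydantic import BaseModel

# Hypothetical schema for illustration only.
class Invoice(BaseModel):
    vendor: str
    total: float

# JSON schema dict that Ollama accepts as the `format` argument.
schema = Invoice.model_json_schema()

# With the `ollama` Python client and a local server you would then call:
#   resp = ollama.chat(model="llama3.2-vision",
#                      messages=[{"role": "user", "content": "Extract the invoice."}],
#                      format=schema)
#   invoice = Invoice.model_validate_json(resp.message.content)

print(sorted(schema["properties"]))
```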