matula 10 hours ago

Very nice. I tried with Ollama and it works well.

The biggest issue is that the Ollama models are hardcoded to Qwen3 and Llama 3.1. I imagine most Ollama users have their own favorites, and those probably vary quite a bit. My main model is usually Gemma 3 12B, which does support images.

It would be a nice feature to allow custom model entries on the Ollama settings page, save them to Chrome storage, and use them in the `getAvailableModels` method alongside the hardcoded models.
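A minimal sketch of what that could look like, assuming the extension's actual names differ (`OllamaModel`, `HARDCODED_MODELS`, the `customOllamaModels` storage key, and this `getAvailableModels` shape are all hypothetical):

```typescript
// Sketch: persist user-defined Ollama models via chrome.storage and merge
// them with the extension's built-in defaults in getAvailableModels.
declare const chrome: any; // provided by the extension runtime

interface OllamaModel {
  id: string;             // model tag as known to Ollama, e.g. "gemma3:12b"
  name: string;           // display name for the settings page
  supportsImages: boolean;
}

// Hypothetical stand-in for the extension's current hardcoded list.
const HARDCODED_MODELS: OllamaModel[] = [
  { id: "qwen3", name: "Qwen3", supportsImages: false },
  { id: "llama3.1", name: "Llama 3.1", supportsImages: false },
];

// Merge custom models over the defaults; a custom entry with the same id
// overrides the hardcoded one rather than duplicating it.
function mergeModels(
  defaults: OllamaModel[],
  custom: OllamaModel[],
): OllamaModel[] {
  const byId = new Map<string, OllamaModel>();
  for (const m of [...defaults, ...custom]) byId.set(m.id, m);
  return [...byId.values()];
}

// getAvailableModels would then read the saved list before returning:
async function getAvailableModels(): Promise<OllamaModel[]> {
  if (typeof chrome !== "undefined" && chrome.storage?.sync) {
    const { customOllamaModels = [] } =
      await chrome.storage.sync.get("customOllamaModels");
    return mergeModels(HARDCODED_MODELS, customOllamaModels);
  }
  return HARDCODED_MODELS; // fallback outside the extension context
}
```

The settings page would write the same key with `chrome.storage.sync.set({ customOllamaModels: [...] })`, so saved models survive restarts and sync across the user's Chrome profiles.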

parsabg 10 hours ago

Great suggestion — will add custom Ollama model configuration in the next release.