Comment by denismi 2 days ago

> Out of the five options available, only one is European (the one I am using). What I don't like is how I cannot add my own custom endpoint. What if I run Mistral locally (with Ollama, for example) and want to use that?

Set up your preferred self-hosted web interface (OpenWebUI or whatever, I haven't looked into this for a while), point it at ollama, and then configure it in Firefox:

browser.ml.chat.provider = http://localhost:3000/

At home I point this at Kagi Assistant, at work I point it to our internal GenAI platform's chat endpoint.
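For anyone who prefers not to click through about:config by hand, the same pref can be persisted in a user.js file in the Firefox profile directory. This is just the standard user.js mechanism applied to the pref named above; the URL is the local OpenWebUI example, not anything special:

```javascript
// user.js in your Firefox profile directory.
// Sets the custom AI chat endpoint on every startup; here it points at a
// self-hosted OpenWebUI instance listening on localhost:3000.
user_pref("browser.ml.chat.provider", "http://localhost:3000/");
```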

denkmoon a day ago

Out of curiosity, for the AI inept, how does this work? I can just point firefox at "https://kagi.com/assistant" and it can use it? Is that using MCP or is there some other standard interface for this?

  • jeroenhd a day ago

    Most of these AI providers use a similar kind of common query structure. OpenWebUI is a mostly consistent copy of ChatGPT so that's what the browser seems to default to when you configure something custom.

    All the AI toolbar really does is open http://ai.url.com/some-query?prompt=${formattedPrompt} and display it next to the web page you have open.

    The formatted prompt is something like "The user is on the page 'Stories about Cats'. The user wants you to summarize the following text: <text you have selected goes here>". You can configure your own prompt in about:config if you want, there are a bunch of examples here: https://github.com/mozilla-l10n/firefox-l10n/blob/main/en-GB...

    There are prompts optimised for specific AI providers, but the generic ones should work with any provider you choose.

    When the web page opens at that URL, you're either going to get redirected to login and then redirected back, or the AI frontend will start executing the prompt.

  • KetoManx64 a day ago

    It functions as a mini browser window without a URL bar.
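The query-string scheme jeroenhd describes can be sketched in a few lines of Python. This is an illustrative reconstruction, not Firefox's actual code: `build_chat_url` and the exact prompt wording are assumptions based on the example prompt quoted above.

```python
# Sketch of how a custom chat-provider URL might be assembled:
# the base URL from browser.ml.chat.provider plus a URL-encoded
# "formatted prompt" passed as a query parameter.
from urllib.parse import urlencode

def build_chat_url(provider: str, page_title: str, selection: str) -> str:
    # Prompt template approximating the example in the comment above.
    formatted_prompt = (
        f"The user is on the page '{page_title}'. "
        f"The user wants you to summarize the following text: {selection}"
    )
    # URL-encode the prompt so spaces and punctuation survive the query string.
    return provider.rstrip("/") + "/?" + urlencode({"prompt": formatted_prompt})

url = build_chat_url("http://localhost:3000/", "Stories about Cats",
                     "Cats are great.")
print(url)
```

The AI frontend at that URL then either redirects through login or starts executing the prompt, exactly as described above.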