charlie0 2 days ago

Lol, this was my second thought, immediately after my first one, which was excitement. I hope the author adds an option for local use. I wonder how that would work as a Chrome extension, though; it doesn't seem like a good idea for extensions to be accessing local resources.

  • mdaniel 2 days ago

    > Doesn't seem like a good idea for extensions to be accessing local resources though.

    To the best of my knowledge, all localhost connections are exempt from CORS, and that's in fact how the 1Password extension communicates with the desktop app. I'd bet Bitwarden and KeePassXC behave similarly.

  • fph 2 days ago

    You can self-host LanguageTool and use it with the Chrome/Firefox extension. The extension talks to a LanguageTool server over HTTP and takes the server's address as a configurable option, so you just run the server locally and pass localhost:8080 as the address.
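
    For anyone curious what that looks like on the wire, here's a rough sketch of the request the extension sends to a self-hosted server (the /v2/check endpoint and its parameters follow LanguageTool's public HTTP API; the port is whatever you started the server on):

    ```javascript
    // Build the URL and form-encoded body for LanguageTool's /v2/check endpoint.
    // The server address is configurable, exactly as in the extension's options.
    function buildCheckRequest(text, server = "http://localhost:8080") {
      const body = new URLSearchParams({ text, language: "en-US" }).toString();
      return { url: server + "/v2/check", method: "POST", body };
    }
    ```

    POST that body with Content-Type application/x-www-form-urlencoded and the server responds with a JSON object whose "matches" array lists the grammar/spelling issues found.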

sharemywin 2 days ago

OpenAI's settings for opting out of training, etc.:

https://help.openai.com/en/articles/7730893-data-controls-fa...

segmondy 2 days ago

Much ado about nothing: the code is there, so edit it and use a local AI.

Eisenstein 2 days ago

Download koboldcpp and the Llama 3.1 GGUF weights, and use it with the llama3 completions adapter.

Edit the 'background.js' file in the extension and replace the OpenAI endpoint with

'http://your.local.ip.addr:5001/v1/chat/completions'

Set anything you want as the API key. Now you have a truly local version.

* https://github.com/LostRuins/koboldcpp/releases

* https://huggingface.co/bartowski/Meta-Llama-3.1-8B-Instruct-...

* https://github.com/LostRuins/koboldcpp/blob/concedo/kcpp_ada...
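
The swap described above can be sketched like this (the constant and function names are illustrative, not the extension's actual code; koboldcpp exposes an OpenAI-compatible chat completions API, so the request shape doesn't change):

```javascript
// Illustrative version of the edit in background.js.
// Before:
//   const ENDPOINT = "https://api.openai.com/v1/chat/completions";
// After, pointing at the local koboldcpp server:
const ENDPOINT = "http://your.local.ip.addr:5001/v1/chat/completions";

// The request body keeps the OpenAI chat-completions shape.
function buildChatRequest(prompt) {
  return {
    url: ENDPOINT,
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Bearer anything", // koboldcpp doesn't check the key
    },
    body: JSON.stringify({
      model: "koboldcpp", // a single-model local server ignores this field
      messages: [{ role: "user", content: prompt }],
    }),
  };
}
```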