Show HN: Light like the Terminal – Meet GTK LLM Chat Front End
(github.com) | 35 points by icarito 2 days ago
Author here. I wanted to keep my conversation with #Gemini about code handy while discussing something creative with #ChatGPT and using #DeepSeek in another window. I think Electron apps are wasteful, so I wanted to chat with LLMs on my own terms. When I discovered the llm CLI tool I really wanted convenient, pretty-looking access to my conversations, so I wrote gtk-llm-chat - a plugin for llm that provides an applet and a simple window for interacting with LLM models.
Make sure you've configured llm first (https://llm.datasette.io/en/stable/).
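For anyone who hasn't used the llm CLI before, setup is roughly the following; the model id and the plugin install line are my assumptions, so check the llm docs and the repo's README for the exact commands:

    # Install Simon Willison's llm CLI
    pip install llm

    # Store an API key for a provider, e.g. OpenAI
    llm keys set openai

    # Optionally pick a default model (model id here is just an example)
    llm models default gpt-4o-mini

    # Install the GTK front end as an llm plugin (assuming it's published under this name)
    llm install gtk-llm-chat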
I'd love to get feedback, PRs and who knows, perhaps a coffee! https://buymeacoffee.com/icarito
It’d be better if it were written in C or at least Vala. With Python, you have to wait a couple hundred milliseconds for the interpreter to start, which makes it feel less native than it could. That said, the latency of the LLM responses is much higher than the UI's, so I guess the slowness of Python doesn't matter.
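That "couple hundred milliseconds" is easy to check on your own machine. A rough sketch, assuming the app imports PyGObject (gi) at startup:

    # Bare interpreter startup
    time python3 -c "pass"

    # Startup plus importing PyGObject (requires python3-gi / PyGObject installed)
    time python3 -c "import gi"

On most systems the bare interpreter is well under 100 ms; the GTK bindings and the app's own imports are where the rest of the startup time goes.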