Comment by Gracana
This looks quite nice. I would like to see the system prompt and inference parameters exposed in the UI, because those are things I'm used to fiddling with in other UIs. Is that something that the llm library supports?
Yeah, absolutely. I've just gotten to the point where I'm happy with the architecture, so I'll keep adding to the UI. I've just added support for fragments, and I decided to present them as if they were attached documents. I also have switching models mid-conversation on the radar, and perhaps the ability to roll back a conversation or remove some messages. But yes, the system prompt and parameters would be nice to expose too! Thanks for the suggestions!
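For anyone curious how that might look on the llm side, here is a minimal sketch of passing a system prompt and inference parameters through the llm Python API. It assumes `llm.get_model()` and keyword options on `prompt()` behave as documented; the available option names (e.g. `temperature`) depend on the model plugin, and the model ID below is just an example.

```python
import llm

# Look up a model by ID (hypothetical example; any installed model ID works).
model = llm.get_model("gpt-4o-mini")

# System prompt and sampling options are passed alongside the user prompt;
# a UI could expose these as editable fields and forward them here.
response = model.prompt(
    "Summarize this project in one sentence.",
    system="You are a concise technical writer.",
    temperature=0.2,
)
print(response.text())
```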