Comment by sixdimensional 2 days ago
I totally agree. The reason I asked before offering any solution ideas was that I was curious what you might think.
My brain went to the concept of memoization, which we use to speed up function calls for common cases.
If you had a proxy that sat in front of the LLM and cached deterministic responses for given inputs, with some way to even give feedback on whether a response is satisfactory, this could be a building block for a runtime design mode or something like that.
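
Rough sketch of what I mean, purely illustrative. `call_llm` here is a hypothetical placeholder for whatever client call the project actually makes, and the cache key includes any generation parameters so a different model or temperature doesn't hit a stale entry:

```python
import hashlib
import json

_cache: dict[str, str] = {}


def call_llm(prompt: str, **params) -> str:
    # Placeholder for the real LLM call (hypothetical, not a real API).
    raise NotImplementedError


def _key(prompt: str, params: dict) -> str:
    # Deterministic key over the prompt plus all parameters that affect output.
    return hashlib.sha256(
        json.dumps({"prompt": prompt, **params}, sort_keys=True).encode()
    ).hexdigest()


def cached_llm(prompt: str, **params) -> str:
    # Memoize: return the cached response if we've seen this exact input before.
    key = _key(prompt, params)
    if key not in _cache:
        _cache[key] = call_llm(prompt, **params)
    return _cache[key]


def mark_satisfactory(prompt: str, good: bool, **params) -> None:
    # Feedback hook: evict an unsatisfactory response so the next call
    # goes back to the model instead of replaying the cached answer.
    if not good:
        _cache.pop(_key(prompt, params), None)
```

In practice the proxy would probably persist the cache and sit at the HTTP layer rather than in-process, but the memoize-plus-feedback shape is the same.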