Comment by Grimblewald 4 days ago
I highly recommend it. As a tip, you can quite easily get into a chat-like state just by using in-context learning. Pre-write a few turns of conversation and generate from there. The model will continue the conversation (for both parties), so you just stop generation once it starts writing on your behalf.
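A minimal sketch of that stopping trick, assuming your base-model backend hands you raw completion text (the canned string below is a stand-in for real model output, and the `User:`/`Assistant:` labels are just one possible format):

```python
# Few-shot chat with a base model: seed the prompt with pre-written
# turns, then cut the completion the moment the model starts
# generating the user's side of the conversation.

PREAMBLE = (
    "User: What's a good way to learn Rust?\n"
    "Assistant: Work through the official book, then build something small.\n"
    "User: Any project ideas?\n"
    "Assistant:"
)

def trim_at_next_turn(completion: str, stop: str = "\nUser:") -> str:
    """Return the completion truncated at the point where the model
    begins writing on the user's behalf."""
    idx = completion.find(stop)
    return completion if idx == -1 else completion[:idx]

# Canned completion standing in for what the model might return:
raw = " A CLI todo app is a classic starter.\nUser: Thanks!"
print(trim_at_next_turn(raw))  # -> " A CLI todo app is a classic starter."
```

Most inference backends let you pass `"\nUser:"` as a stop sequence directly, which does the same thing server-side.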
That said, it's useful for so much more beyond chat. Outline the premise of a book, then append "what follows is that book\n# Chapter 1:" and watch it rip. Base models are my preferred way of using LLMs by a long margin.