Comment by deepsquirrelnet 14 hours ago

One of the issues with using LLMs for content generation is that instruction tuning causes mode collapse. For example, if you ask an LLM to generate a random number between 1 and 10, it might pick one particular number, say 7, 80% of the time. Base models do not exhibit the same behavior.
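The skew described above is easy to quantify once you have samples: tally the picks and look at the share claimed by the most common value. A minimal sketch, using simulated samplers as stand-ins for real models (the weights are hypothetical, chosen only to mimic the 80%-on-7 behavior mentioned above):

```python
import random
from collections import Counter

def collapsed_sampler():
    # Stand-in for an instruction-tuned LLM: heavily favors 7.
    # Weights are hypothetical, chosen to mimic the skew described above.
    return random.choices(range(1, 11),
                          weights=[2, 2, 2, 2, 2, 2, 84, 2, 1, 1])[0]

def uniform_sampler():
    # Stand-in for a base model with a much flatter output distribution.
    return random.randint(1, 10)

def mode_share(sampler, n=10_000):
    # Fraction of samples taken by the single most common value.
    counts = Counter(sampler() for _ in range(n))
    value, count = counts.most_common(1)[0]
    return value, count / n

print(mode_share(collapsed_sampler))  # mode is 7, share around 0.84
print(mode_share(uniform_sampler))    # mode share near 0.1
```

For a uniform distribution over 1-10 the mode's share hovers near 0.1; a heavily collapsed model pushes it toward 1.0, which is one concrete way to measure the effect on a real model's outputs.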

“Creative output” takes on an entirely different meaning once you start thinking about LLMs in terms of how they actually work.

orbital-decay 5 hours ago

Creativity is a really ill-defined term, but generally it has a lot more to do with abstract thinking and understanding subtlety and nuance than with mode collapse. Mode collapse affects variation, which is probably a part of creativity under some definitions of it, but they aren't the same at all.