Comment by nearbuy
> Current generation of AI models can't think of anything truly new.
How could you possibly know this?
Is this falsifiable? Is there anything we could ask it to draw where you wouldn't just claim it must be copying some image in its training data?
Novelty in one medium arises from novelty in others, or from shifts in the external environment.
We got brass bands with brass instruments, synth music from synths.
We know, therefore, that there can necessarily be nothing novel from an LLM -- it has no live access to novel developments in the broader environment. If synths were invented after its training, it could never produce synth music (and so on).
The claim here is trivially falsifiable -- so obviously so that credulous fans of this technology bake it into their misunderstanding of novelty itself. The test: have an LLM produce content about developments that had yet to take place at the time of its training. It obviously cannot do this.
Yet an artist who paints with a new kind of black pigment can, trivially so.