Comment by cdecl
> LLMs are just text prediction. That's what they are.
This sort of glib talking point doesn't really hold up, because if you showed the current state of affairs to a random developer from 2015, you would absolutely blow their damned socks off.