spoonfeeder006 7 days ago

Well, arguably so do LLMs: with a fixed seed and the temperature set to zero, the same input prompt produces the same code. Granted, it's a different kind of determinism, but my point is that as LLMs get better at solving sub-problems, they may become reliable enough that coding via LLMs becomes the new norm, and effective prompting becomes a new skill for coders.
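The determinism point can be sketched with a toy next-token sampler (everything here is hypothetical for illustration, not any real model's API): greedy decoding always picks the same token, while nonzero-temperature sampling only repeats if you fix the seed.

```python
import random

# Hypothetical next-token distribution for some fixed "prompt".
VOCAB_PROBS = {"return": 0.5, "print": 0.3, "pass": 0.2}

def greedy_decode(probs):
    # Temperature 0 / greedy: always pick the most likely token,
    # so the same prompt yields the same output every run.
    return max(probs, key=probs.get)

def sampled_decode(probs, rng):
    # Nonzero temperature: sample from the distribution,
    # so repeated calls can differ unless the RNG seed is fixed.
    tokens, weights = zip(*probs.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

# Greedy is deterministic across runs:
assert all(greedy_decode(VOCAB_PROBS) == "return" for _ in range(100))

# Sampling is reproducible only when the seed is pinned:
a = sampled_decode(VOCAB_PROBS, random.Random(42))
b = sampled_decode(VOCAB_PROBS, random.Random(42))
assert a == b
```

Real APIs behave analogously: pin the temperature (and seed, where offered) and outputs become repeatable, modulo backend nondeterminism.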