Comment by MattRix
Well no, you just need to tune the taste of the model to produce things that humans find appealing. This has already happened with the image generation models. I don’t see any reason it can’t happen with these code generation models too.
The whole thing feels a bit like a god-of-the-gaps situation, where we keep trying to squeeze humanity into whatever little gaps the current generation of AI hasn't mastered yet.