Comment by jamalaramala a day ago
> Even more concerning was Devin’s tendency to press forward with tasks that weren’t actually possible. (...)
> Devin spent over a day attempting various approaches and hallucinating features that didn’t exist.
One of the big problems with GenAI models is their inability to know what they don't know.
Because of that, they don't ask clarifying questions.
Humans, in the same situation, would spend a lot of time learning before they could be truly productive.
Your statement is factually wrong: Claude 3.5v2 asks clarifying questions when needed "natively", and you can add similar instructions to your prompt for any model.
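A minimal sketch of the "add similar instructions in your prompt" approach: embed an ask-before-guessing rule in the system prompt of a chat request. The model id, prompt wording, and payload shape below are illustrative assumptions (modeled on common chat-completion APIs), not a specific vendor's required format.

```python
# Sketch: steering any chat model to ask clarifying questions via the
# system prompt. Model id and wording are illustrative assumptions.

CLARIFY_SYSTEM_PROMPT = (
    "Before starting a task, check whether the request is ambiguous, "
    "underspecified, or possibly impossible. If so, do not guess: ask "
    "the user clarifying questions and wait for answers before proceeding."
)

def build_request(user_message: str) -> dict:
    """Build a chat request payload that includes the clarifying-question rule."""
    return {
        "model": "claude-3-5-sonnet-latest",  # hypothetical/illustrative model id
        "system": CLARIFY_SYSTEM_PROMPT,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 1024,
    }

request = build_request("Migrate our build system to the new toolchain.")
```

The payload would then be sent through whatever client library the model vendor provides; the point is only that the clarifying behavior is an instruction you control, not an intrinsic property of one model.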