Comment by an0malous
If I write JavaScript that outputs “5+5=55”, we would rightfully call that an error regardless of the implementation details. But when an LLM does it, you’re saying it’s not an error, it’s “generated tokens that fall within the statistical limits of the present configuration.”
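(For concreteness, here’s a minimal hypothetical sketch of how a JavaScript program could produce exactly that output: a value read in as a string makes `+` concatenate instead of add. The variable names are mine, just for illustration.)

```javascript
// Classic type-coercion bug: user input arrives as a string,
// so + performs string concatenation rather than addition.
const a = "5"; // e.g. a value read from a form field or prompt()
const b = 5;
console.log(`5+5=${a + b}`); // prints "5+5=55" — nobody calls this "not a bug"
```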
My point was that, from the perspective of the end user, this is an error. If ChatGPT were described as a “random text generator” then sure, maybe this wouldn’t be considered an error, because “5+5=55” is perfectly good random text. But that’s not what 90% of ChatGPT users expect, nor how the app is marketed; it’s marketed as an Artificial Intelligence Assistant.