Comment by 9dev
And… that is called overfitting. If you show the model values for y, but they are 2 in 99% of all cases, it’s likely going to yield 2 when asked about the value of y, even if the prompt didn’t specify or forbid 2 specifically.
If you take the perspective of all the possible responses to the request, then it is overfit, because it only ever returns a single, non-generalized response.
But if you look at it from the perspective that there is only one example to learn from, it is maybe not overfit.
> If you show the model values for y, but they are 2 in 99% of all cases, it’s likely going to yield 2 when asked about the value of y
That's not overfitting. That's either just correct or underfitting (if we say it's never returning anything but 2)!
Overfitting is where the model matches the training data too closely and has inferred a complex relationship using too many variables where there is really just noise.
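To make that distinction concrete, here is a minimal sketch (toy data invented for illustration, using numpy's polyfit): when the truth is just a noisy constant, a degree-0 fit that learns the mean is "just correct", while a degree-10 polynomial has enough free parameters to chase the noise itself, which is overfitting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the "y is 2 in 99% of cases" situation:
# the true signal is just the constant 2, plus a little noise.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 + rng.normal(scale=0.1, size=x.size)

# A degree-0 fit learns the mean (~2). If the truth really is
# constant, this is "just correct", not overfitting.
const_fit = np.polyfit(x, y, deg=0)

# A degree-10 polynomial has enough free parameters to chase
# the noise itself -- that is overfitting.
wiggly_fit = np.polyfit(x, y, deg=10)

def mse(coeffs, xs, ys):
    """Mean squared error of a polynomial fit on the given points."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

# The overfit model always looks better on the data it trained on...
print("train MSE, constant: ", mse(const_fit, x, y))
print("train MSE, degree-10:", mse(wiggly_fit, x, y))

# ...but fresh samples from the same constant-plus-noise process
# expose how much of what it "learned" was just noise.
x_new = rng.uniform(0.0, 1.0, size=200)
y_new = 2.0 + rng.normal(scale=0.1, size=x_new.size)
print("test MSE, constant:  ", mse(const_fit, x_new, y_new))
print("test MSE, degree-10: ", mse(wiggly_fit, x_new, y_new))
```

The point of the sketch: the complex model's training error can only go down relative to the simple one (its hypothesis space contains the constant fit), but that says nothing about how it behaves on data it hasn't seen.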