Comment by tefkah
Theoretically it would be much less expensive to just keep running the existing models, but of course none of the current leaders are going to stop training new ones any time soon.
So are we on a hockey stick right now, where each new model is so much better than the previous one that you have to keep training?
Because almost every previous case of something like this eventually leveled out.