Comment by nodja 2 hours ago

Because success for them doesn't mean it works; it means it works much better than what they currently have. If a 1% improvement comes at the cost of spending 10x more on training and 2x more on inference, then your runs are failures. (Numbers pulled out of thin air.)

mrweasel an hour ago

That makes sense. It's not that the training failed to complete or produced a useless model; it's that the capabilities have plateaued.