Comment by cratermoon 5 days ago

> AI labs are pursuing huge compute ramp-ups to scale training

Yeah, and many people, not just Marcus, doubt that the huge ramp-ups and scale will yield proportional gains. If you have evidence otherwise, share it.

int_19h 5 days ago

The point is that those ramp-ups indicate that quite a few people do believe that they will yield gains, if not proportional, then still large enough to justify the expense. Which is to say, the claim that "even the best funded teams are moving to hybrid approaches" is not evidence of anything.

  • skeledrew 5 days ago

    Believing that something is the case doesn't make it so. And right now the available evidence points against it more than for it, which is the point. Maybe it so happens that there's another sudden leap with X amount more scaling, but the only thing anyone has on that front is faith. Faith is all that's maintaining the bubble.

KaoruAoiShiho 4 days ago

No shit it doesn't offer proportional gains; that's been part of the scaling laws from the very beginning. There are of course diminishing returns, but that doesn't mean scaling isn't worth pursuing or that there won't be useful returns from it.
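The "diminishing but nonzero returns" point can be sketched with a Kaplan-style power law, where loss falls as a power of compute: each 10x of compute buys a smaller improvement than the last 10x, yet the improvement never reaches zero. The exponent and constant below are illustrative stand-ins, not figures from any published scaling-law paper or lab.

```python
def loss(compute, alpha=0.05, scale=1.0):
    """Toy power-law loss curve: L(C) = scale * C ** (-alpha).
    alpha and scale are made-up illustrative values."""
    return scale * compute ** (-alpha)

# Each 10x step in compute still lowers the loss, but by less each time.
for c in [1e21, 1e22, 1e23, 1e24]:
    print(f"compute={c:.0e}  loss={loss(c):.4f}")
```

Under any curve of this shape, whether the next 10x is "worth it" is an economic question about the cost of compute versus the value of the remaining gain, which is roughly the disagreement in this thread.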

Everyone out there is saying these reports are very misleading. Pretty sure it's just sensationalizing the known diminishing returns.