Comment by deelowe 20 hours ago

Not exactly. Infra will win the race, and in this aspect Google is miles ahead of the competition. Their DC solutions scale very well. Their only risk is that their hardware and low-level software stack is EXTREMELY custom. They don't even fully leverage OCP. Having said that, this has never been a major problem for Google over their 20+ years of moving away from OTS parts.

amelius 20 hours ago

But anyone with enough money can build infra. Maybe not at Google's scale, but maybe that's not necessary (unless you have a continuous stream of fresh, high-quality training data).

  • shaftway 16 hours ago

    Anyone with enough money can cross any moat. That's one of the many benefits of having infinite money.

  • piva00 20 hours ago

    If making infra means designing their own silicon targeting only inference instead of more general-purpose GPUs, I can agree with you; otherwise, long-term success comes down to how cheaply they can run the infra compared to competitors.

    Depending on Nvidia for your inference means you'll be price-gouged for it: Nvidia has a golden goose for now and will milk it as much as possible.

    I don't see how a company without optimised hardware can win in the long run.

    • amelius 19 hours ago

      The silicon can be very generic. I don't see why prices of "tensor" compute units can't come down if the world sees the value in them, just as happened with CPUs.