Comment by jt2190 a day ago

Last paragraph is informative:

> Anthropic relies heavily on a combination of chips designed by Amazon Web Services known as Trainium, as well as Google’s in-house designed TPU processors, to train its AI models. Google largely uses its TPUs to train Gemini. Both chips represent major competitive threats to Nvidia’s best-selling products, known as graphics processing units, or GPUs.

So which leading AI company is going to build on Nvidia, if not OpenAI?

paxys a day ago

"Largely" is doing a lot of heavy lifting here. Yes, Google and Amazon are making their own AI chips, but they are also buying as many Nvidia chips as they can get their hands on. As are Microsoft, Meta, xAI, Tesla, Oracle, and everyone else.

  • greiskul a day ago

    But is Google buying those GPU chips for their own use, or to have them on their data centers for their cloud customers?

    • dekhn a day ago

      Google buys Nvidia GPUs for cloud; I don't think they use them much, if at all, internally. The TPUs are used both internally and in cloud, and now it looks like they are delivering them to customers in their own data centers.

      • hansvm a day ago

        When I was there a few years ago, we only got CPUs and GPUs for training. TPUs were in too high demand.

      • moralestapia a day ago

        I can see them being used for training if they're vacant.

  • bredren a day ago

    How about Apple? How is Apple training its next foundation models?

    • consumer451 a day ago

      To use the parlance of this thread: "next" foundation models is doing a lot of heavy lifting here. Am I doing this right?

      My point is, does Apple have any useful foundation models? Last I checked they made a deal with OpenAI, no wait, now with Google.

      • wmf a day ago

        Apple does have its own small foundation models, but it's not clear they require a lot of GPUs to train.

      • system2 a day ago

        I think Apple is waiting for the bubble to deflate, and will then do something different. And they already have a ready-to-use user base to make money from whatever they build.

    • xvector a day ago

      Apple is sitting this whole thing out. Bizarre.

      • paxys 19 hours ago

        The options for a company in their position are:

        1. Sit out and buy the tech you need from competitors.

        2. Spend to the tune of ~$100B+ in infra and talent, with no guarantee that the effort will be successful.

        Meta picked option 2, but Apple has always had great success with 1 (search partnership with Google, hardware partnerships with Samsung etc.) so they are applying the same philosophy to AI as well. Their core competency is building consumer devices, and they are happy to outsource everything else.

      • catdog a day ago

        Well, they tried and they failed. In that case maybe the smartest move is not to play. It looks like the technology is largely turning into a commodity in the long run anyway, so sitting this out and letting others make the mistakes first might not be the worst of all ideas.

      • runako a day ago

        This whole thread is about whether the most valuable startup of all time will be able to raise enough money to see the next calendar year.

        It's definitely rational to decide to pay wholesale for LLMs given:

        - consumer adoption is unclear. The "killer app" for OS integration has yet to ship by any vendor.

        - owning SOTA foundation models can put you into a situation where you need to spend $100B with no clear return. This money gets spent up front regardless of how much value consumers derive from the product, or if they even use it at all. This is a lot of money!

        - as Apple has "missed" the last couple of years of the AI craze, there have been no meaningful ill effects on their business. Beyond the tech press, nobody cares yet.

      • vessenes a day ago

        I mean, they tried. They just tried and failed. It may work out for them, though — two years ago it looked like lift-off was likely, or at least possible, so having a frontier model was existential. Today it looks like you might be able to save many billions by being a fast follower. I wouldn’t be surprised if the lift-off narrative comes back around though; we still have maybe a decade until we really understand the best business model for LLMs and their siblings.

      • cs_sorcerer a day ago

        From a technology standpoint, I don't feel building foundation models is Apple's core competency.

    • downrightmike a day ago

      They are in-housing their AI to sell it as a secure way to do AI, which 100% puts them in the lead for the foreseeable future.

wmf a day ago

OpenAI will keep using Nvidia GPUs but they may have to actually pay for them.

Morromist a day ago

Nvidia had the chance to build its own AI software and chose not to. It has been a good choice so far: better to sell shovels than to go to the mines. But they could still go mining if the other miners start making their own shovels.

If I were Nvidia I would be hedging my bets a little. OpenAI looks like it's on shaky ground, it might not be around in a few years.

  • vessenes a day ago

    They do build their own software, though. They have a large body of stuff they make. My guess is that it’s done to stay current, inform design and performance, and to have something to sell enterprises along with the hardware; they have purposely not gone after large consumer markets with their model offerings as far as I can tell.

  • system2 a day ago

    There is no way Nvidia could make from AI software even a fraction of what they are making from hardware.

mcintyre1994 a day ago

That’s interesting, I didn’t know that about Anthropic. I guess it wouldn’t really make sense to compete with OpenAI and everyone else for Nvidia chips if they can avoid it.

dylan604 a day ago

Would Nvidia investing heavily in ClosedAI dissuade others from using Nvidia?

  • smileson2 a day ago

    Aren't they switching to PI for Pretend Intelligence?

  • irishcoffee a day ago

    If nothing else, the video game market would explode under AMD, maybe?

papichulo2023 21 hours ago

The elephant in the room is China also being partially successful with their own chips.

raincole a day ago

Literally all the other companies that still believe they can be the leading ones one day?

lofaszvanitt a day ago

The moment you threaten NVDA's livelihood, your company starts to fall apart. History tells us.

rvz a day ago

It's almost as if everyone here was assuming that Nvidia would have no competition for a long time, but it has been known for a while that many competitors are coming after their data center revenues. [0]

> So which leading AI company is going to build on Nvidia, if not OpenAI?

It's xAI.

But what matters is that there is more competition for Nvidia, and they bought Groq to reduce it. OpenAI is building their own chips, as is Meta.

The real question is this: What happens when the competition catches up with Nvidia and takes a significant slice out of their data center revenues?

[0] https://news.ycombinator.com/item?id=45429514

dfajgljsldkjag a day ago

The Chinese will probably figure out a way to sneak Nvidia chips around the sanctions.

  • ekianjo a day ago

    Alibaba now has their own chips that they use for training.