Comment by stephc_int13 20 hours ago

104 replies

Everyone is trying to compare AI companies with something that happened in the past, but I don't think we can predict much from that.

GPUs are not railroads or fiber optics.

The cost structure of ChatGPT and other LLM-based services is entirely different from the web: they are very expensive to build, but they also cost a lot to serve.

Companies like Meta, Microsoft, Amazon, and Google would all survive if their massive investments do not pay off.

On the other hand, OpenAI, Anthropic, and others could soon find themselves in a difficult position, at the mercy of Nvidia.

wood_spirit 20 hours ago

Unlike railroads and fibre, all the best compute in 2025 will be lacklustre in 2027. It won't retain much value the way the infrastructure of previous bubbles did, will it?

  • christina97 19 hours ago

    The A100 came out 5.5 years ago and is still the staple for many AI/ML workloads. Even AI hardware just doesn’t depreciate that quickly.

    • Ianjit 7 hours ago

      Users are waiting for Blackwell. Then Rubin. CRWV depreciates GPUs over 6 years. Rails last a lot longer.

    • littlestymaar 6 hours ago

      Unless you care about FLOP/Watt, which big players definitely do.

    • oblio 13 hours ago

      Don't they degrade physically from being run at full blast 24/7 for so many years?

    • dzhiurgis 16 hours ago

      This. There’s even a market for them being built (DRW).

  • layoric 20 hours ago

    > Unlike railroads and fibre, all the best compute in 2025 will be lacklustre in 2027.

    I definitely don't think compute is anything like railroads and fibre, but I'm not so sure compute will continue its efficiency gains of the past. Power consumption for these chips is climbing fast, a lot of the gains are from better hardware support for 8-bit/4-bit precision, and I believe yields are getting harder to achieve as features get smaller.

    Betting against compute getting better/cheaper/faster is probably a bad idea, but fundamental improvements I think will be a lot slower over the next decade as shrinking gets a lot harder.

    • palmotea 19 hours ago

      >> Unlike railroads and fibre, all the best compute in 2025 will be lacklustre in 2027.

      > I definitely don't think compute is anything like railroads and fibre, but I'm not so sure compute will continue its efficiency gains of the past. Power consumption for these chips is climbing fast, a lot of the gains are from better hardware support for 8-bit/4-bit precision, and I believe yields are getting harder to achieve as features get smaller.

      I'm no expert, but my understanding is that as feature sizes shrink, semiconductors become more prone to failure over time. Those GPUs probably aren't all going to fry themselves in two years, but even if GPU designs stagnate, chip longevity may limit the medium- to long-term value of the (massive) investment.

    • spiderice 19 hours ago

      Unfortunately changing 2027 to 2030 doesn't make the math much better

      • JumpCrisscross 14 hours ago

        > changing 2027 to 2030 doesn't make the math much better

        Could you show me?

        Early turbines didn't last that long. Even modern ones are only rated for a few decades.

    • skywhopper 19 hours ago

      Unfortunately the chips themselves probably won’t physically last much longer than that under the workloads they are being put to. So, yes, they won’t be totally obsolete as technology in 2028, but they may still have to be replaced.

      • munk-a 18 hours ago

        Yeah - I think the extremely fast depreciation of GPUs, just from wear and use, is pretty underappreciated right now. So you've spent $300 million on a brand-new data center - congrats - you'll need to pay off that loan and somehow raise another $100 million just to maintain that capacity for three years, on chip replacement alone.

        There is an absolute glut of cheap compute available right now thanks to VC and other funds pouring into the industry (take advantage of it while it exists!), but I'm pretty sure Wall St. will balk when they realize the continuing cost of maintaining that compute and compare it to the revenue the expenditure generates. People think of chips as infrastructure: you buy a personal computer and it'll keep chugging for a decade without issue in most cases. But GPUs are essentially consumables - an input to producing the compute a data center sells, needing constant restocking - rather than a one-time investment.
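        The consumables arithmetic above can be sketched in a few lines (a hypothetical Python sketch; the $300M/$100M figures are the commenter's, and the facility/GPU split and lifetimes are assumptions for illustration):

        ```python
        # Hypothetical split of a $300M data center build, per the comment above:
        # the facility lasts decades, the GPU fleet is worn down in a few years.
        facility_cost = 200e6        # shell, power, cooling (assumed split)
        gpu_cost = 100e6             # initial GPU fleet (assumed split)
        gpu_life_years = 3           # assumed, given 24/7 full-load wear
        facility_life_years = 20     # assumed

        # Straight-line annual depreciation for each part
        gpu_annual = gpu_cost / gpu_life_years
        facility_annual = facility_cost / facility_life_years

        print(f"GPU depreciation:      ${gpu_annual / 1e6:.0f}M/yr")
        print(f"Facility depreciation: ${facility_annual / 1e6:.0f}M/yr")
        # The GPU line dominates: you re-buy roughly the whole fleet
        # every gpu_life_years just to stand still.
        ```

        Under these assumptions the fleet costs about three times as much per year as the building that houses it, which is the "consumable, not infrastructure" point in miniature.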

        • davedx 5 hours ago

          There are some nuances there.

          - Most big tech companies are investing in data centers using operating cash flow, not levering it

          - The hyperscalers have in recent years been tweaking the depreciation schedules of regular cloud compute assets (extending them), so there's a push and a pull going on for CPU vs GPU depreciation

          - I don't think anyone who knows how to do fundamental analysis expects any asset to "keep chugging for a decade without issue" unless it's explicitly rated to do so (like e.g. a solar panel). All assets have depreciation schedules, GPUs are just shorter than average, and I don't think this is a big mystery to big money on Wall St

      • chermi 16 hours ago

        Do we actually know how they're degrading? Are there still Pascals out there? If not, is it because they actual broke or because of poor performance? I understand it's tempting to say near 100% workload for multiple years = fast degradation, but what are the actual stats? Are you talking specifically about the actual compute chip or the whole compute system -- I know there's a big difference now with the systems Nvidia is selling. How long do typical Intel/AMD CPU server chips last? My impression is a long time.

        If we're talking about the whole compute system like a gb200, is there a particular component that breaks first? How hard are they to refurbish, if that particular component breaks? I'm guessing they didn't have repairability in mind, but I also know these "chips" are much more than chips now so there's probably some modularity if it's not the chip itself failing.

      • epolanski 16 hours ago

        I'm not sure.

        The number of cycles the silicon goes through matters, but what matters most are temperature and electrical shocks.

        If the GPUs are kept stable and at low temperature, they can run at full load for years. There are servers out there that have been up for decades.

  • potatolicious 20 hours ago

    Yep, we are (unfortunately) still running on railroad infrastructure built a century ago. The amortization periods on that spending are ridiculously long.

    Effectively every single H100 in existence now will be e-waste in 5 years or less. Not exactly railroad infrastructure here, or even dark fiber.

    • 9rx 19 hours ago

      > Yep, we are (unfortunately) still running on railroad infrastructure built a century ago.

      That which survived, at least. A whole lot of rail infrastructure was not viable and soon became waste of its own. There was, at one time, ten rail lines around my parts, operated by six different railway companies. Only one of them remains fully intact to this day. One other line retained a short section that is still standing, which is now being used for car storage, but was mostly dismantled. The rest are completely gone.

      When we look back in 100 years, the total amortization cost for the "winner" won't look so bad. The “picks and axes” (i.e. H100s) that soon wore down, but were needed to build the grander vision won't even be a second thought in hindsight.

      • palmotea 18 hours ago

        > That which survived, at least. A whole lot of rail infrastructure was not viable and soon became waste of its own. There was, at one time, ten rail lines around my parts, operated by six different railway companies. Only one of them remains fully intact to this day. One other line retained a short section that is still standing, which is now being used for car storage, but was mostly dismantled. The rest are completely gone.

        How long did it take for 9 out of 10 of those rail lines to become nonviable? If they lasted (say) 50 years instead of 100, because that much rail capacity was (say) obsoleted by the advent of cars and trucks, that's still pretty good.

      • lesuorac 19 hours ago

        If 1 in 10 investments lasts 100 years, that seems pretty good to me. Plus I'd bet a lot of the material cost of the other 9 was recouped when the steel was scrapped. I don't think you're going to recoup a lot of money from H100s.

    • hyperbovine 16 hours ago

      > Effectively every single H100 in existence now will be e-waste in 5 years or less.

      This is definitely not true, the A100 came out just over 5 years ago and still goes for low five figures used on eBay.

    • fooker 19 hours ago

      > Effectively every single H100 in existence now will be e-waste in 5 years or less.

      This remains to be seen. The H100 is 3 years old now and is still the workhorse of all the major AI shops. When something obviously better for training arrives, these will still be used for inference.

      If what you say were true, you could find an A100 for cheap/free right now. But check out the prices.

      • fxtentacle 18 hours ago

        Yeah, I can rent an A100 server for roughly the same price as what the electricity would cost me.

    • Spooky23 17 hours ago

      How was your trip down the Third Avenue El? Did your goods arrive via boxcar at 111 8th Ave?

      • selimthegrim 14 hours ago

        At the rate they are throwing obstacles at the promised subway they got rid of the 3rd Ave El for, maybe his/her grandkids will finish the trip.

    • SJC_Hacker 19 hours ago

      > Yep, we are (unfortunately) still running on railroad infrastructure built a century ago. The amortization periods on that spending is ridiculously long.

      Are we? I was under the impression that the tracks degraded due to stresses like heat/rain/etc. and had to be replaced periodically.

      • ralph84 18 hours ago

        The track bed, rails, and ties will have been replaced many times by now. But the really expensive work was clearing the right of way and the associated bridges, tunnels, etc.

        • intrasight 14 hours ago

          I am really digging the railroad analogies in this discussion! There are some striking similarities if you do the right mappings and timeframe transformations.

          I am an avid rail-to-trail cycler and more recently a student of the history of the rail industry. The result was my realization that the ultimate benefit to society and to me personally is the existence of these amazing outdoor recreation venues. Here in Western PA we have many hundreds of miles of rail-to-trail. My recent realization is that it would be totally impossible for our modern society to create these trails today. They were built with blood, sweat, tears and much dynamite - and not a single thought towards environmental impact studies. I estimate that only ten percent of the rail lines built around here are still used for rail. Another ten percent have become recreational trails. That percent continues to rise as more abandoned rail lines transition to recreational use. Here in Western PA we add a couple dozen miles every year.

          After reading this very interesting discussion, I've come to believe that the AI arms race is mainly just transferring capital into the pockets of the tool vendors - just as was the case with the railroads. The NVidia chips will be amortized over 10 years and the models over perhaps 2 years. Neither has any lasting value. So the analogy to rail is things like dynamite and rolling stock. What in AI will maintain value? I think the data center physical plants, power plants and transmission networks will maintain their value longer. I think the exabytes of training data will maintain their value even longer.

          What will become the equivalent of rail-to-trail? I doubt that any of the laborers or capitalists building rail lines foresaw that their ultimate value to society would be that people like me could enjoy a bike ride. What are the now-unforeseen long-term benefits to society of this AI investment boom?

          Rail consolidated over 100 years into just a handful of firms in North America, and my understanding is that these firms are well-run and fairly profitable. I expect a much more rapid shakeout and consolidation to happen in AI. And I'm putting my money on the winners being Apple first and Google second.

          Another analogy I just thought of - the question of will the AI models eventually run on big-iron or in ballpoint pens. It is similar to the dichotomy of large-scale vs miniaturized nuclear power sources in Asimov's Foundation series (a core and memorable theme of the book that I haven't seen in the TV series).

  • mcswell 15 hours ago

    "...all the best compute in 2025 will be lacklustre in 2027": How does the compute (I assume you mean on PCs) of 2025 compare with the compute of 2023?

    Oh wait, the computer I'm typing this on was manufactured in 2020...

    • brianwawok 15 hours ago

      Neato. How's that 1999-era laptop? Because 25-year-old trains are still running and 25-year-old train track is still almost new. It's not the same and you know it.

      • 1oooqooq 14 hours ago

        Last month HN was talking about a Win95 machine with floppy drives handling rail in Germany, no less.

  • Spooky23 17 hours ago

    Unlike 1875, we have Saudi and other trillionaires/billionaires willing to commit almost any amount to own the future of business.

    • rchaud 16 hours ago

      Except they behave less like shrewd investors and more like bandwagon jumpers looking to buy influence or get rich quick. Crypto, Twitter, ridesharing, office sharing and now AI. None of these have been the future of business.

      Business looks a lot like what it has throughout history. Building physical transport infrastructure, trade links, improving agricultural and manufacturing productivity and investing in military advancements. In the latter respect, countries like Turkey and Iran are decades ahead of Saudi in terms of building internal security capacity with drone tech for example.

      • Spooky23 16 hours ago

        Agreed - I don’t think they are particularly brilliant as a category. Hereditary kleptocracy has limits.

        But… I don't think there's an example in modern history of this much capital moving around based on whim.

        The “bet on red” mentality has produced some odd leaders with absolute authority in their domain. One of the most influential figures on the US government claims to believe that he is saving society from the antichrist. Another thinks he’s the protagonist in a sci-fi novel.

        We have the madness of monarchy with modern weapons and power. Yikes.

  • Analemma_ 20 hours ago

    Exactly: when was the last time you used GPT-3.5? Its value depreciated to zero after, what, two and a half years? (And the Nvidia chips used to train it have barely retained any value either.)

    The financials here are so ugly: you have to light truckloads of money on fire forever just to jog in place.

    • falcor84 19 hours ago

      I would think it's more like a general codebase: even if, after 2.5 years, 95 percent of the lines were rewritten, and even if the whole thing was rewritten in a different language, there is no point in time at which its value diminished, as you arguably couldn't have built the new version without all the knowledge (and institutional knowledge) from the older version.

      • spwa4 19 hours ago

        I rejoined a previous employer of mine, one everyone here knows ... and I found that half their networking equipment is still being maintained by code I wrote in 2012-2014. It has not been rewritten. Hell, I rewrote a few parts that badly needed it, despite joining a different part of the company.

    • tim333 17 hours ago

      OpenAI is now valued at $500bn though. I doubt the investors are too wrecked yet.

      It may be like looking at the early Google and saying they are spending loads on compute and haven't even figured out how to monetize search, so the investors are doomed.

      • oblio 12 hours ago

        Google was founded in 1998 and IPOed in 2004. If OpenAI were feeling confident, they'd find a way to set up a company and IPO, 9 years after founding. It's all mostly fictional money at this point.

    • CompoundEyes 15 hours ago

      I did, actually, a few days ago: gpt-3.5-fast is a great model for certain tasks, and cost-wise via the API. Lots of solutions being built on today's latest models are built for tomorrow's legacy model: if it works, just pin the version.

    • fooker 19 hours ago

      > And the Nvidia chips used to train it have barely retained any value either

      Oh, I'd love to get a cheap H100! Where can I find one? You'll find it costs almost as much used as it does new.

    • cj 19 hours ago

      > money on fire forever just to jog in place.

      Why?

      I don't see why these companies can't just stop training at some point. Unless you're saying the cost of inference is unsustainable?

      I can envision a future where ChatGPT stops getting new SOTA models, and all future models are built for enterprise or people willing to pay a lot of money for high ROI use cases.

      We don't need better models for the vast majority of chats taking place today. E.g. kids using it for help with homework: are today's models really not good enough?

      • MontyCarloHall 14 hours ago

        >I don't see why these companies can't just stop training at some point.

        Because training isn't just about making brand new models with better capabilities, it's also about updating old models to stay current with new information. Even the most sophisticated present-day model with a knowledge cutoff date of 2025 would be severely crippled by 2027 and utterly useless by 2030.

        Unless there is some breakthrough that lets existing models cheaply incrementally update their weights to add new information, I don't see any way around this.

      • Eisenstein 19 hours ago

        They aren't. They're obsequious. This is much worse than it seems at first glance, and you can tell it's a big deal because a lot of the effort going into training the new models is aimed at mitigating it.

    • mattmanser 20 hours ago

      But is it a bit like a game of musical chairs?

      At some point the AI becomes good enough, and if you're not sitting in a chair at the time, you're not going to be the next Google.

      • potatolicious 19 hours ago

        Not necessarily? That assumes that the first "good enough" model is a defensible moat - i.e., the first ones to get there becomes the sole purveyors of the Good AI.

        In practice that hasn't borne out. You can download and run open-weight models now that are within spitting distance of state-of-the-art, and open-weight models are at most a few months behind the proprietary stuff.

        And even within the realm of proprietary models no player can maintain a lead. Any advances are rapidly matched by the other players.

        More likely, at some point the AI becomes "good enough"... and every single player gets a "good enough" AI shortly thereafter. There doesn't seem to be a scenario where any player can afford to stop setting cash on fire and start making money.

        • wood_spirit 5 hours ago

          Perhaps the first thing the owners ask the first true AGI is “how do I dominate the world?” and the AGI outlines how to stop any competitor getting AGI..?

conartist6 16 hours ago

It's not just that the investments won't pay off; it's that the global markets are likely to crash, as happened with the subprime mortgage crisis.

  • vitaflo 16 hours ago

    This is much closer to the dotcom boom than the subprime stuff. The dotcom boom/bust affected tech more than anything else. It didn’t involve consumers like the housing crash did.

    • bobxmax 16 hours ago

      The dot com boom involved silly things like Pets.com IPOing pre-revenue. Claude code hit $500m in ARR in 3 months.

      The fact that people don't see the difference between the two is unreal. Hacker News has gone full r* around this topic; you'll find better nuance even on Reddit than here.

      • mcintyre1994 8 hours ago

        Do you mean pre-profit/without ever making a profit? I found an article about their IPO:

        > Pets.com lost $42.4 million during the fourth quarter last year on $5.2 million in sales. Since the company's inception in February of last year, it has lost $61.8 million on $5.8 million in sales.

        https://www.cnet.com/tech/tech-industry/pets-com-raises-82-5...

        They had sales, they were just making a massive loss. Isn’t that pretty similar to AI companies, just on a way smaller scale?

        We haven’t seen AI IPOs yet, but it’s not hard to imagine one of them going public before making profit IMO.

        • bobxmax 3 hours ago

          You'd think after all this time nerds would stop obsessing over profit. Profit doesn't matter; it hasn't mattered for a long time, because tech companies have such fat margins they could go profitable in months if they wanted to.

          Yes, $5m in sales. That's effectively pre-revenue for a tech company.

      • conartist6 3 hours ago

        What you're missing is how that value comes about. People seem to think it's an infinite fountain but it's more like strip mining the commons.

        We also know that AI hype is holding up most of the stock market at this point, including ticker symbols you don't think of as belonging to "AI companies". Market optimism at large is coming from the idea that companies won't need employees soon, or that they can keep using AI to de-leverage and de-skill their workforce.

        • bobxmax 3 hours ago

          So that $500m in ARR in 3 months is from hype? That's what you're contending?

      • lelandbatey 16 hours ago

        They're not claiming that it's like the dot com boom because no one is actually making money. They're claiming that this is more like the dot com boom than the housing bubble, which I think is true. The dot com crash didn't cause Jane-on-the-street to lose her house while she worked a factory job, though the housing crisis did have those kinds of consumer-affecting outcomes.

      • oblio 12 hours ago

        1. Claude Code is claimed to have hit $500m ARR in 3 months.

        2. What is the Claude Code profit for the same period?

        3. What is the Claude Code profit per request served when excluding fixed expenses such as training the models?

      • jrflowers 15 hours ago

        You have a good point. Pets.com would have fared much better if investors gave them several billion dollars in 1998, 1999 and then again in 2000

    • CodingJeebus 5 hours ago

      We are starting to see larger economic exposure to AI.

      Banks are handing out huge loans to the neocloud companies that are being collateralized with GPUs. These loans could easily go south if the bottom falls out of the GPU market. Hopefully it’s a very small amount of liquidity tied up in those loans.

      Tech stocks make up a significant part of the stock market now. Where the tech stocks go, the market will follow. Everyday consumers invested in index funds will definitely see a hit to their portfolios if AI busts.

    • digdugdirk 16 hours ago

      But it does involve a ton of commercial real estate investment, as well as a huge shakeup in the energy market. People may not lose their homes, but we'll all be paying for this one way or another.

  • mothballed 16 hours ago

    The fed could still push the real value of stocks quite a bit by destroying the USD, if they want, by pinning interest rates near 0 and forcing a rush to the exits to buy stock and other asset classes.

    • mcny 16 hours ago

      The point still stands though. All these other companies can pivot to something else if AI fails, but what will OpenAI do?

      • rubyfan 16 hours ago

        By the time it catches up with them, they will have IPO'd and dumped their problem onto the public market. The administration will probably get a golden share, and they will get a bailout in an effort to soften the landing for their campaign donors, who also hold huge positions. All the rich people will be made whole, and the US taxpayer will pay the price of the bailout.

        And Microsoft or whoever will absorb the remains of their technology.

      • rglover 16 hours ago

        Sell to Microsoft and be absorbed there (and Anthropic to Amazon).

      • mandeepj 16 hours ago

        > but what will OpenAI do?

        Will get acquired at “Store Closing” price!!

JCM9 20 hours ago

Businesses are different but the fundamentals of business and finance stay consistent. In every bubble that reality is unavoidable, no matter how much people say/wish “but this time is different.”

redwood 16 hours ago

I'm reminded of the quote "If you owe the bank $100 that's your problem. If you owe the bank $100 million, that's the bank's problem." - J. Paul Getty

Nvidia may well be at the mercy of them! Hence the recent circular dealing

bee_rider 15 hours ago

The past/present businesses they remind me of most are semiconductor fabs. Significant generation-to-generation R&D investment, significant hardware and infrastructure investment, quite winner-takes-all at the high end, obsoleted in a couple of years at most.

The main differences are that these models are early in their development curve, so the jumps are much bigger; they are entirely digital, so they get "shipped" much faster; and open weights seem to be possible. None of those factors makes it a more attractive business to be in.

01100011 12 hours ago

The one thing smaller companies might have is allocated power budgets from power companies. Part of the mad dash to build datacenters right now is just to claim the power so your competitors can't. Now, I do think the established players hold an edge here, but I don't think OpenAI/Anthropic/etc. are without some bargaining power (hah).

LarsDu88 17 hours ago

If you build the actual datacenter, less than half the cost is the compute itself. The other half is the datacenter infrastructure: the building, power, and cooling.

So in that sense it's not that much different from Meta and Google, which also ran on server infrastructure that depreciated over time. The difference is that, I believe, Meta and Google made money hand over fist even in their earliest days.

  • Lalo-ATX 15 hours ago

    Last time i ran the numbers -

    Data center facilities are ~$10k per kW

    IT gear is like $20k-$50k per kW

    Data center gear is good for 15-30 years. IT is like 2-6ish.

    Would love to see updated numbers. Got any?
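    Annualizing the rough per-kW figures above makes the comparison concrete (a hypothetical Python sketch; midpoints of the commenter's ranges are assumed, not measured):

    ```python
    # Midpoints of the ranges quoted above (assumptions for illustration)
    facility_capex_per_kw = 10_000   # ~$10k/kW, lasting ~20 yrs (mid of 15-30)
    it_capex_per_kw = 35_000         # mid of $20k-$50k/kW, lasting ~4 yrs (mid of 2-6)

    # Straight-line annualized capex per kW for each part
    facility_annual = facility_capex_per_kw / 20
    it_annual = it_capex_per_kw / 4

    print(f"Facility: ${facility_annual:,.0f}/kW/yr")   # $500
    print(f"IT gear:  ${it_annual:,.0f}/kW/yr")         # $8,750
    # Even though facility and IT gear are comparable upfront,
    # the short-lived IT gear dominates the annualized cost.
    ```

    Under these assumptions the IT gear costs well over ten times as much per kW per year as the facility, which is why the short depreciation schedule matters so much.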

EasyMark 13 hours ago

In the end, revenues must exceed costs or you have an issue. That "startup" money will eventually be gone, and you're back to MIMO: Money In vs. Money Out. If in isn't greater than out, you will go bankrupt.

lossolo 19 hours ago

The funniest thing about all this is that the biggest difference between the LLMs from Anthropic, Google, OpenAI, and Alibaba is not model architecture or training objectives, which are broadly similar, but the dataset. What people don't realize is how much of that data comes from massive undisclosed scrapes, plus synthetic data, plus countless hours of expert feedback shaping the models. As methodologies converge, the performance gap between these systems is already narrowing and will continue to diminish over time.

yieldcrv 20 hours ago

Just because they have ongoing costs after purchasing the hardware doesn't mean this is different from anything we've seen before. What are you trying to articulate, exactly? Either this is a simple business that can get costs under control eventually, or it's not.