ajb a day ago

DRAM alternates between feast and famine; it's the nature of a business where the granularity of investment is so huge (you either have a fab or you don't, and they cost billions, maybe trillions by now). So it will swing back. Unfortunately it looks like maybe 3-5 years on average, from some analysis here: https://storagesearch.com/memory-boom-bust-cycles.html

(That's just me eyeballing it, feel free to do the math)

ksec 12 hours ago

I am so glad that both the top-rated and the majority of comments on HN finally understand the DRAM industry, instead of the constant refrain that DRAM is a cartel and that is why things are expensive.

Also worth mentioning that Samsung's DRAM and NAND profits are what keep Samsung Foundry fighting TSMC. Especially for those who think TSMC is somehow a monopoly.

Another thing to point out, which has not been mentioned yet: China is working on both DRAM and NAND. Both LPDDR5 and stacked NAND are already in production, waiting on yield and scale. Higher prices will finally be the perfect timing for them to join the commodity DRAM and NAND race. Good for consumers, I suppose; not so good for a lot of other things which I won't go into.

  • ls612 10 hours ago

    DRAM manufacturers have literally been convicted of price fixing in the past, so why do you have to white knight for them?

    • ksec 31 minutes ago

      And I am 100% sure a lot of other commodity industries would have been convicted of price fixing if we looked into them. And I say this as someone who has witnessed it first hand.

      Unfortunately the commodity business is not sexy; it doesn't get the press, nor does it get taught even in business schools. But a lot of the time, what gets called price fixing is a natural phenomenon.

      I won't even go into the fact that what gets decided in court doesn't always mean it is right.

      I will also add that we absolutely want the DRAM and NAND industries, or in fact any industry, to make profits, as much profit as they can. What is far more important is where they spend those profits. I didn't look into SK Hynix, but both Samsung and Micron spend significant amounts on R&D, at least in part to lower the total production cost of DRAM per GB. We want them to make a healthy margin selling DRAM at $1/GB, not lose money and then go bankrupt.

    • kbolino 7 hours ago

      Both stories can be true.

      The firms can coordinate by agreeing on a strategy they deem necessary for the future of the industry; that strategy requires significant capital expenditures; the industry does not get (or does not want) outside investment to fund it; and if any of the firms defects and keeps prices low, the others cannot execute on the strategy, so they all agree to raise prices.

      Then, after the strategy succeeds, they have gotten addicted to the higher revenues, they do not allow prices to fall as fast as they should, their coordination becomes blatantly illegal, and they have to get smacked down by regulators.
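
      A toy payoff sketch of that coordination problem, with invented numbers (a minimal prisoner's-dilemma-style model, not industry data; the firms and payoffs are hypothetical): holding prices high only pays off if nobody undercuts, while undercutting is individually tempting either way, which is why such an agreement needs enforcement to survive.

        # Toy cartel-coordination model with made-up payoffs (illustration only).
        # Each hypothetical firm chooses to "raise" (hold prices up) or "undercut".
        from itertools import product

        FIRMS = ["A", "B", "C"]  # hypothetical firms, not real companies

        def profit(choice: str, others: list) -> int:
            if choice == "undercut":
                # A lone defector grabs market share while the others hold prices up.
                return 12 if all(o == "raise" for o in others) else 4
            # "raise": high margins only materialize if everyone holds the line.
            return 10 if all(o == "raise" for o in others) else 2

        # Enumerate every combination of choices and print each firm's payoff.
        for combo in product(["raise", "undercut"], repeat=len(FIRMS)):
            payoffs = [profit(c, [o for j, o in enumerate(combo) if j != i])
                       for i, c in enumerate(combo)]
            print(dict(zip(FIRMS, combo)), "->", dict(zip(FIRMS, payoffs)))

      With these numbers, "undercut" dominates for each firm in isolation, even though all-"raise" beats all-"undercut" for everyone - the fragility described above.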

      • vee-kay 6 hours ago

        > The firms can coordinate by agreeing on a strategy they deem necessary for the future of the industry.. Then, after the strategy succeeds, they have gotten addicted to the higher revenues, they do not allow prices to fall as fast as they should, their coordination becomes blatantly illegal..

        So said and did the infamous Phoebus cartel, which conspired to unnaturally "fix" the prices and quality of light bulbs.

        https://spectrum.ieee.org/the-great-lightbulb-conspiracy

        https://en.wikipedia.org/wiki/Phoebus_cartel

        For more than a century, one strange mystery has puzzled the world: why do old light bulbs last for decades while modern bulbs barely survive a couple of years?

        The answer lies in a secret meeting held in Geneva, Switzerland in 1924, where the world’s biggest light bulb companies formed the notorious Phoebus Cartel.

        Their mission was simple but shocking: control the global market, set fixed prices, and most importantly… reduce bulb lifespan.

        Before this cartel, bulbs could easily run for 2,500+ hours. But after the Phoebus Cartel's pact and actions, all member companies were forced to limit lifespan to just 1,000 hours. More failures meant more purchases. More purchases meant more profit. Any company that refused faced heavy financial penalties.

        The most unbelievable proof is the world-famous Livermore Fire Station bulb in California, glowing since 1901. More than 120 years old. Still alive. While our new incandescent bulbs die in 1–2 years.

        Though the Phoebus cartel was dissolved in the 1930s due to government pressure, its impact still shadows modern manufacturing. Planned obsolescence didn’t just begin here… but Phoebus made it industrial.

        https://m.youtube.com/watch?v=0U5uU6nzgO8

      • Y_Y 7 hours ago

        > The firms can coordinate by agreeing on a strategy they deem necessary for the future of the industry

        As long as it doesn't fall into the "collusion" prohibitions of the relevant competition law.

        > “People of the same trade seldom meet … but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.”

        Adam Smith, The Wealth of Nations (1776)

    • hollerith 9 hours ago

      Most of us who've been on Earth for a while know that courts often get it wrong. And even if the particular court decision you mention was correct, that does not mean price fixing is the main or underlying reason DRAM prices sometimes go up.

      • lazide 9 hours ago

        They blatantly were doing it, admitted to it, and did it again later. What kind of crazy is this?

        Is this the ‘but he loves me, he wouldn’t hit me again’ of the tech world?

fullstop 10 hours ago

Historically, yes. But we've never seen demand like this AI buildout before. What happens when OpenAI and NVIDIA monopolize the majority of DRAM output?

Yokolos 21 hours ago

I wouldn't be so sure. I've seen analyses making the case that this new phase is unlike previous cycles and that DRAM makers will be far less willing to invest significantly in new capacity, especially for consumer DRAM as opposed to enterprise DRAM or HBM (and even there, there's still a significant risk of the AI bubble popping). The shortage could last a decade. Right now DRAM makers are benefiting to an extreme degree, since they can basically demand any price for what they're making now, which reduces the incentive even more.

https://www.tomshardware.com/pc-components/storage/perfect-s...

  • zozbot234 19 hours ago

    The most likely direct response is not new capacity, it's older capacity running at full tilt (given the now higher margins) to produce more mature technology with lower requirements on fabrication (such as DDR3/4, older Flash storage tech, etc.) and soak up demand for these. DDR5/GDDR/HBM/etc. prices will still be quite high, but alternatives will be available.

    • kees99 14 hours ago

      > produce more mature technology ... DDR3/4

      ...except the current peak in demand is mostly driven by the build-out of AI capacity.

      Both inference and training workloads are often bottlenecked on RAM speed, and trying to shoehorn older/slower memory tech in there would require a non-trivial amount of R&D to widen the memory bus on CPUs/GPUs/NPUs, which is unlikely to happen - those are in very high demand already.
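
      A back-of-the-envelope sketch of why bus width matters here (nominal, simplified peak-bandwidth figures that ignore real-world overheads; the parts are just examples):

        # Peak bandwidth ~= transfer rate (MT/s) x bus width (bytes), ignoring overheads.
        def peak_gb_per_s(mega_transfers_per_s: float, bus_width_bits: int) -> float:
            """Theoretical peak bandwidth in GB/s."""
            return mega_transfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

        ddr4_3200 = peak_gb_per_s(3200, 64)    # ~25.6 GB/s per 64-bit channel
        ddr5_6400 = peak_gb_per_s(6400, 64)    # ~51.2 GB/s per 64-bit channel
        hbm3_stack = peak_gb_per_s(6400, 1024) # ~819 GB/s per 1024-bit HBM3 stack

        # Matching a single HBM3 stack with DDR4-3200 would take roughly this many
        # 64-bit channels - hence the "widen the memory bus" R&D mentioned above:
        print(round(hbm3_stack / ddr4_3200))   # ~32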

      • ifwinterco 14 hours ago

        Even if AI stuff does really need DDR5, there must be lots of other applications that would ideally use DDR5 but can make do with DDR3/4 if there's a big difference in price.

        • shevy-java 13 hours ago

          I mean, AI is currently hyped, so the most natural and logical assumption is that AI drives these prices up primarily. We need compensation from those AI corporations. They cost us too much.

  • justin66 11 hours ago

    > The shortage could last a decade.

    Do we really think the current level of AI-driven data center demand will continue indefinitely? The world only needs so many pictures of bears wearing suits.

    • lukeschlather 10 hours ago

      The pop culture perception of AI as just image and text generators is incorrect. AI is many things, and they all need tons of RAM. Google is rolling out self-driving taxis in more and more cities, for instance.

      • justin66 10 hours ago

        Congrats on engaging with the facetious part of my comment, but I think the question still stands: do you think the current level of AI-driven data center demand will continue indefinitely?

        I feel like the question of how many computers are needed to steer a bunch of self-driving taxis probably has an answer, and I bet it's not anything even remotely close to what would justify a decade's worth of maximum investment in silicon for AI data centers, which is what we were talking about.

        • lazide 9 hours ago

          Data center AI is also completely uninteresting/non-useful for self-driving taxis, or any other self-driving vehicle.

    • downrightmike 8 hours ago

      No, even the 10% best-case-scenario return on AI won't make it. The bubble is trying to replace all human labor, which is why it is a bubble in the first place. No one is being honest that AGI is not possible with this kind of tech, and scale won't get them there.

  • snuxoll 19 hours ago

    There's no difference between "consumer" DRAM and "enterprise" DRAM at the silicon level; they're cut from the same wafers at the end of the day.

    • david-gpu 17 hours ago

      Doesn't the same factory produce enterprise (i.e. ECC) and consumer (non-ECC) DRAM?

      If there is high demand for the former due to AI, they can increase its production to generate higher profits. This cuts the production capacity for consumer DRAM and leads to higher prices in that segment too. Simple supply & demand at work.

      • crote 14 hours ago

        Conceptually, you can think of it as "RAID for memory".

        A consumer DDR5 module has two 32-bit-wide buses, each implemented using, for example, 4 chips that each handle 8 bits and operate in parallel - just like RAID 0.

        An enterprise DDR5 module has a 40-bit-wide bus implemented using 5 chips. The memory controller uses those 8 additional bits to store parity calculated over the 32 regular bits - just like RAID 4 (or RAID 5; I haven't dug into the details too deeply). All the magic happens inside the controller; the DRAM chip itself isn't even aware of it.

        Given the way the industry works (some companies do DRAM chip production, which is sold as a commodity, and others buy a bunch of chips to turn them into RAM modules), the factory producing the chips doesn't even know whether the chips it has just produced will end up in ECC or non-ECC modules. The prices rise and fall as one because it is functionally a single market.
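
        A minimal sketch of that RAID-4 analogy (illustrative only - real DDR5 ECC uses a SEC-DED-style code in the memory controller rather than plain XOR parity, and the "chips" here are just 8-bit integers):

          # One data "chip" carries 8 of the 32 data bits; a 5th chip stores
          # XOR parity, so any single failed chip can be rebuilt - like RAID 4.
          def parity(lanes: list) -> int:
              """XOR parity over the data lanes (the extra chip's contents)."""
              p = 0
              for lane in lanes:
                  p ^= lane
              return p

          def reconstruct(lanes: list, p: int) -> list:
              """Rebuild a single missing lane (marked None) from the rest plus parity."""
              missing = [i for i, lane in enumerate(lanes) if lane is None]
              assert len(missing) == 1, "only one failed lane can be recovered"
              rebuilt = p
              for lane in lanes:
                  if lane is not None:
                      rebuilt ^= lane
              fixed = list(lanes)
              fixed[missing[0]] = rebuilt
              return fixed

          # Four 8-bit chips holding a 32-bit word, plus the parity chip:
          data = [0xDE, 0xAD, 0xBE, 0xEF]
          p = parity(data)
          assert reconstruct([0xDE, None, 0xBE, 0xEF], p) == data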

      • matthews3 16 hours ago

        At the silicon level, it is the same.

        Each memory DIMM/stick is made up of multiple DRAM chips. ECC DIMMs have an extra chip for storing the error-correcting parity data.

        The bottleneck is the chips, not the DIMMs. Chip fabs are expensive and time-consuming, while making PCBs and placing components onto them is much easier to get into.

    • Yokolos 16 hours ago

      Yes, but if new capacity is also redirected so it can be sold as enterprise memory, we won't see better supply for consumer memory. As long as margins are better and demand is higher for enterprise memory, the average consumer is screwed.

      • bobbob1921 8 hours ago

        Does it matter that AI hardware has such a short shelf life / fast upgrade cycle? Meaning we may see the RAM chips resold or thrown back onto the used market quicker than before?

      • immibis 16 hours ago

        Is there still a difference? I have DDR5 registered ECC in my computer.

        • Yokolos 16 hours ago

          I mean, the only difference we care about is how much of it is actual RAM vs HBM (to be used on GPUs) and how much it costs. We want it to be cheap. So yes, there's a difference if we're competing with enterprise customers for supply.

          I don't really understand why every little thing needs to be spelled out. It doesn't matter. We're not getting the RAM at an affordable price anymore.

  • dangus 13 hours ago

    Anytime somebody is making a prediction for the tech industry involving a decade timespan I pull out my Fedora of Doubt and tip my cap to m’lady.

  • rasz 19 hours ago

    A LOT of businesses learned during Covid that they can make more money by permanently reducing output and jacking up prices. We might be witnessing the end times of economies of scale.

    • Incipient 18 hours ago

      The idea is that someone else comes in who's happy to eat their lunch by undercutting them. Unfortunately, we're probably limited to China doing that at this point, as a lot of the existing players have literally been fined for price fixing before.

      https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal

      • autoexec 8 hours ago

        It seems more likely that someone else comes in and either colludes with the people who are screwing us to get a piece of the action, or gets bought out by one of the big companies who started all this. Since, on the rare occasions companies get caught, they only get weak slaps on the wrist and pay a fraction of what they made in profits (basically just the US demanding its cut), I don't have much faith things will improve any time soon.

        Even China has no reason to reduce prices much for memory sold to the US when they know we have no choice but to buy at the prices already set by the cartel. I expect that if China does start making memory they'll sell it cheap within China and export it at much higher prices. Maybe we'll get a black market for cheap DRAM smuggled out of China though.

    • PunchyHamster 2 hours ago

      In that case it's far simpler: even if they wanted to meet the demand, building more capacity is hideously expensive and takes years.

      So it would happen even with the best intentions and no conspiracies. The AI boom already hiked GPU prices; memory was next in line.

    • trhway 15 hours ago

      I think in part it is a system-level response to the widespread just-in-time approach of those businesses' clients. A just-in-time client is very "flexible" on price when supply is squeezed. After this back and forth, I think we'll see a return to some degree of supply buffering (warehousing) to dampen the supply-level and price shocks in the pipelines.

      • CamperBob2 9 hours ago

        I thought that, too, but then the Nexperia shitstorm hit, and it was as if the industry had learned nothing at all from the COVID shortages.

addaon a day ago

Nothing costs trillions.

  • chmod775 21 hours ago

    If you had a trillion dollars you might find some things are for sale that otherwise wouldn't be...

    • nolok 17 hours ago

      To be fair, nobody HAS a trillion dollars either. They have stuff that may be worth a trillion dollars when sold.

crote 14 hours ago

Is this still the case in 2025, though?

In a traditional pork cycle there's a relatively large number of players and a relatively low investment cost. The DRAM market in the 1970s and 1980s operated quite similarly: you could build a fab for a few million dollars, and DRAM could be made in a fab which also churned out regular logic - it's how Intel got started! There were dozens of DRAM-producing companies in the US alone.

But these days the market looks completely different. It is roughly equally divided between SK Hynix, Micron, and Samsung. Building a fab costs billions and can easily take 5 years - if not a decade - from start to finish. Responding to current market conditions is basically impossible; you have to plan for the market you expect years from now.

Ignoring the current AI bubble, DRAM demand has become relatively stable - and so has the price. Unless there's a good reason to believe the current buying craze will last over a decade, why would the DRAM manufacturers risk significantly changing their plans and potentially creating an oversupply in the future? It's not like the high prices are hurting them...

  • lazide 13 hours ago

    Also, current political turbulence makes planning for the long term extremely risky.

    Will the company be evicted from the country in 6 months? A year? Will there be 100% tariffs on competitors' imports? Or 0%? Will there be an anti-labor government in effect when the investment matures, or a pro-labor one?

    The bigger the investment, the longer the investment timeframe, and the saner (more modest) the returns, the harder it is to make the investment happen.

    High risk requires a correspondingly high potential return.

    That everyone has to pay more for current production is a side effect of the uncertainty, because no one knows the odds of even future production actually happening, let alone the next fancy whiz-bang technology.

    But people do need the current production.

darkwater 13 hours ago

My guess is that they will plummet when the AI bubble bursts.

jbverschoor a day ago

A wafer is a wafer. The cost is per square mm. It's pure supply and demand.

  • chmod775 21 hours ago

    No, a wafer is very much not a wafer. DRAM processes are very different from making logic*. You don't just make memory in your fab today and logic tomorrow. But even when you stay in your lane, the industry operates on very long cycles and needs scale to function at any reasonable price at all. You don't just dust off your backyard fab to make the odd bit of memory whenever it is convenient.

    Nobody is going to do anything if they can't be sure that they'll be able to run the fab they built for a long time and sell most of what they make. Conversely, fabs don't tend to idle a lot. Sometimes they're only built if their capacity is essentially sold already. Given how massive the AI bubble is looking right now, I personally wouldn't expect anyone to gamble on building a new fab.

    * Someone explained this at length on here a while ago, but I can't seem to find their comment. Should've favorited it.

    • jbverschoor 7 hours ago

      Sure, yes, the cost of producing a wafer is fixed; opex didn't change that much.

      Following your reasoning, which is common in manufacturing, the capex needed is already allocated. So, where does the 2x price hike come from if not supply/demand?

      The cost to produce did not go up 100%, or even 20%.

      Actually, DRAM fabs do get scaled down, very similar to the Middle East scaling down oil production.

      • chmod775 4 hours ago

        > So, where does the 2x price hike come from if not supply/demand?

        It absolutely is supply/demand. Well, mostly demand, since supply is essentially fixed over shorter time spans. My point is that "cost per square mm [of wafer]" is too much of a simplification, given that it depends mostly on the specific production line and also ignores a lot of what goes on down the line. You can use it to look at one fab making one specific product in isolation, but it's completely useless for comparing between fabs or for looking at the entire industry.

        It's a bit like saying the cost of cars is per gram of metal used. Sure, you can come up with some number, but what is it really useful for?

      • zozbot234 7 hours ago

        DRAM/flash fab investment probably did get scaled down due to the formerly low prices, but once you do have a fab it makes sense to have it produce flat out. Then that chunk of potential production gets allocated into DRAM vs. HBM, various sorts of flash storage etc. But there's just no way around the fact that capacity is always going to be bottlenecked somehow, and a lot less likely to expand when margins are expected to be lower.

    • incrudible 16 hours ago

      > Sometimes they're only built if their capacity is essentially sold already.

      "Hyperscalers" already have multi-year contracts going. If the demand really was there, they could make it happen. Now it seems more like they're taking capacity from what would've been sold on the spot or quarterly markets. They already made their money.
