Samsung's 60% DRAM price hike signals a new phase of global memory tightening
(buysellram.com)
442 points by redohmy 8 days ago
It is a weird form of centralized planning, except there's no election to get onto the central committee. It's like the Soviet era, where you had to run in the right circles and have sway in them.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar: a bunch of executives following the mandates of their boards, all because a few CEOs were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and get better decision making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes somewhat equal decision-making power among the econs. We are quickly trending away from that.
The funniest thing is that somehow the executive class is even more out of touch than they used to be.
At least before there was a certain common baseline derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to run a company full of tech workers who despise the current plan, while being constantly reinforced by LinkedIn and ChatGPT.
The out of touch leader is a trope that I'm willing to bet has existed as long as we've had leaders.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch, so much as we're way more aware.
No surprise, the CxO class barely lives in the same physical world as us peasants. They all hang out together in their rich-people restaurants and rich-people galas and rich-people country clubs and rich-people vacation spots, socializing with other rich-people and don't really have a lot of contact with normal people, outside of a handful of executive assistants and household servants.
We need better antitrust and anti-monopoly enforcement. Break up the biggest companies, and then they'll have to actually participate in markets.
This was Lina Khan's big thing, and I'd argue that our current administration is largely a result of Silicon Valley no longer being able to get exits in the form of mergers and IPOs.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
I think a better solution is an exponential tax on company size. I.e. once a company starts to earn above, say, $1 billion, its income is taxed at an ever-increasing rate. Put another way: use taxes to break the power-law, winner-takes-all distribution of company sizes into a Gaussian one.
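For illustration only, here is a minimal sketch of what a smooth "diminishing returns to self" schedule could look like; the cap and steepness are entirely invented numbers, not a policy proposal:

    # Hypothetical "soft cap" on post-tax income; every number
    # here is invented for illustration.
    import math

    def post_tax(gross, cap=1e9, steepness=1.0):
        # tanh is sigmoid-shaped: roughly linear below the cap,
        # asymptotically flat above it. You can always make more,
        # but with diminishing returns to self.
        return cap * math.tanh(steepness * gross / cap)

    for gross in (1e8, 1e9, 1e10, 1e11):
        print(f"gross {gross:>15,.0f} -> kept {post_tax(gross):>15,.0f}")

Below the cap the curve is close to 1:1; past it, nearly every marginal dollar goes to the rest of society.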
> There's too much group-think in the executive class.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
Sadly, it's the natural result of industries where economies of scale and the price of entry make anyone who isn't massive uncompetitive.
I don't think there is even a good solution for that. The government could essentially sponsor some competition, but it's easy for that to slide from "helping the market" into "handouts for the incompetent".
Diversity is good for populations. If you have a tiny pool of individuals with mostly the same traits (in this case I mean things like culture, education, morality, ethics, rather than class and race - though there are obvious correlations) then you get what some other comments are describing as being effectively centralized planning with extra steps, rather than a market of competing ideas.
> We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note, the military is surely a money-burning machine, but IMHO that's only government spending, while most of the money in the US is deliberately private.
Could the fintech sector be a bigger example of a money-vacuuming system that benefits statistically nobody?
Exactly. So instead of electing the people who will allocate the resources, the people who are successful in one thing are given the right to manage the resources for whatever they wish, and they can keep being very wrong for a very long time while other people are deprived of those resources by the mismanagement and can't do anything about it.
In theory I guess this creates a demand that should be satisfied by the market, but in reality it seems like when wealth is too concentrated in the hands of the few who make all the decisions, the market is unable to act.
Centralized planning is needed in any civilization. You need some mechanism to decide where to put resources, whether it's to organize the annual school excursion or to construct the national highway system.
But yeah, in the end companies behave in trends: if some companies do something, then the other companies have to do it too, even if it makes things less efficient or is outright harmful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on it.
There is a Pascal's-wager argument to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them toward AI efforts, plus annoying customers with AI features they don't want or need.
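As a toy expected-value version of that wager (every probability and payoff below is invented for illustration):

    # Toy expected-value framing of the AI "Pascal's wager";
    # all numbers are invented for illustration.
    p_big = 0.1          # assumed chance AI really is a huge shift
    miss_out = -100.0    # relative payoff of sitting out if it is
    waste = -5.0         # payoff of investing if it turns out small
    annoy = -1.0         # cost of shipping unwanted AI features

    # Payoff is 0 when you guess right; compare the two ways to be wrong.
    ev_invest = (1 - p_big) * (waste + annoy)   # -5.4
    ev_sit_out = p_big * miss_out               # -10.0
    print(ev_invest, ev_sit_out)

Even with a small p_big, the asymmetric downside makes every individual board choose to play, which is exactly the trend-following behavior described above.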
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to the people controlling the AI, with all the consequences that has.
If you get paid for being rich in proportion to how rich you are -- because that's how assets work -- it turns into an exponential, runs away, and concentrates power until something breaks.
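In symbols (a standard compound-growth identity, nothing specific to this thread), with W for wealth and r for the rate of return on assets:

    \frac{dW}{dt} = rW \quad\Longrightarrow\quad W(t) = W_0\, e^{rt}

Any positive r compounds into an exponential, so absent some external brake, the concentration only accelerates.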
It's centralized vs. decentralized not public vs. private. A centralized private planning committee is still centralized.
This is why I think taxes on the very wealthy should be so high that billionaires can't happen. The usual reasons are either about raising revenue or are vague ideas about inequality. It doesn't raise enough revenue to matter, and inequality is a fairly weak justification by itself.
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
You'll just get a different form of power concentration. Do you think the Soviet Union didn't have power concentration in individuals? Of course it did, that's why the general secretary of the party was more important than the actual heads of state and government.
> But the power concentration is a strong reason.
A centralized authority capable of so severely restricting the economic freedom of the most powerful people implies a far greater concentration of power than the one you're fighting against. You're proposing to cure the common cold with AIDS.
>It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
No, it's pure capitalism where Atlas shrugged and ordered billions worth of RAM. You might not like it but don't call it "centralized planning" or "Soviet era".
Every corporation is a (not so) little pocket of centrally planned economy.
The only saving grace is that it can die and others will scoop up released resources.
When a country-level planned economy dies, people die and resources get destroyed.
> The only saving grace is that it can die and others will scoop up released resources.
Ideally. Realistically, in a market with only a few companies around, it makes things even less competitive.
> Every corporation is a (not so) little pocket of centrally planned economy.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price-guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
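A minimal sketch of that price-guided rule for a price-taking firm (the cost curve and market price are invented for illustration):

    # Expand output while the next unit's marginal cost is at or
    # below marginal revenue (== market price for a price taker).
    # Both the cost curve and the price are invented numbers.
    def marginal_cost(q):
        return 2.0 + 0.05 * q   # rising MC: later units cost more

    price = 10.0                # MR for a price-taking firm

    q = 0
    while marginal_cost(q + 1) <= price:
        q += 1                  # this unit still adds profit

    print(f"produce {q} units; unit {q + 1} would cost "
          f"{marginal_cost(q + 1):.2f} > {price}")

The quantity falls out of prices rather than being set by a planner, which is the distinction the comment is drawing.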
I disagree.
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible infrastructure is and how we've ignored trillions of maintenance to give a few thousand people tax breaks they don't really need.
>AI is the first industrial breakthrough in a century
Is it?
Why not follow the time-honoured approach and put the data centres in low-income countries?
The British deindustrialized India.
I assume they don't have good enough power infrastructure.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
This resonates deeply, especially to someone born in the USSR.
This is part of how free markets self-correct: misallocate resources and you run out of resources.
You can blame irrational exuberance, bubbles, or whatnot, but markets are ultimately individual choices times economic power. AI, crypto, housing, dot-com, and so on through history all had excess, because it's not obvious when to join and when to stop.
Usually companies run out of resources before they screw up global prices in massive markets.
If it was a couple billion dollars of memory purchasing nobody would care.
> Usually companies run out of resources before they screw up global prices in massive markets.
It happens more often than you might expect.
The Onion Futures Act and what led to it is always a fun read: https://en.wikipedia.org/wiki/Onion_Futures_Act
> This is part of how free markets self-correct: misallocate resources and you run out of resources.
Except that these corporations will almost certainly get a bail out, under the auspices of national security or some other BS. The current admin is backed by the same VCs that are all in on AI.
They're treating it as a "winner takes all" kind of business. And I'm not sure this is a reasonable bet.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
What's odd about this is I believe there does exist a winner takes all technology. And that it's AR.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
I wonder, is there any way to avoid this kind of market failure? Even a planned economy could succumb to hype - promises that improved societal efficiency are just around the corner.
> Is there any way to avoid this kind of market failure?
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
There is a way, and anyone who tells you we have to go full Hitler or Stalin to do it is lying, because the last time we let inequality cook this hard, FDR and the New Deal figured out how to thread the needle and proved it could be done.
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
A tax on scale.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football and will be fought over, just like the Fed's benchmark interest rate. But it's a single scalar, instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
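A sketch of how a single-scalar scale tax could be wired up; the schedule shape and the coefficient k are invented for illustration:

    # One-knob "tax on scale": the extra rate grows with the log of
    # revenue, so doubling in size always costs the same increment.
    # Both the shape and k are invented for illustration.
    import math

    def extra_rate(revenue, k=0.03):
        return min(0.9, k * math.log10(max(revenue, 1.0)))

    for rev in (1e6, 1e9, 1e12):
        print(f"revenue {rev:>16,.0f} -> extra rate {extra_rate(rev):.0%}")

The political fight then collapses onto the single number k, which is the point being made above.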
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Why? That's exactly the circumstance where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization, so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, which will be able to retool their production and supply these devices at a higher profit.
>society is better served by having it produced by the few small companies than the one big company.
Well, assuming the scale couldn't instead be used for the benefit of society rather than to milk it dry. But yes, probably the best option with a reasonable chance of success. Eventually. Maybe.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
How so? Costs will be higher with multiple small producers, resulting in higher prices for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
Unless I get all the resources I want, when I want, all at low prices, the market has obviously failed.
Yes, except unironically. A market that cannot efficiently serve the vast majority of the population is a failed market.
Gamers at least enjoy their GPUs and memory.
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from-- having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're Nvidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
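To make that parenthetical concrete with loudly invented numbers (none of these figures come from the article):

    # How many enthusiast PCs equal one hypothetical AI data centre?
    # Every figure below is an assumption for illustration.
    servers = 50_000             # assumed servers in one large DC
    ram_per_server_gb = 2_048    # assumed 2 TB of DRAM per AI server
    ram_per_gaming_pc_gb = 32    # a typical enthusiast build

    pcs = servers * ram_per_server_gb // ram_per_gaming_pc_gb
    print(f"{pcs:,} gaming PCs' worth of DRAM")   # 3,200,000

Under those assumptions, carving gamers out of the scramble costs the AI buildout a rounding error, which is why it's cheap PR.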
What do you think happens when the majority of consumers are priced not only out of bread, but also circuses?
There is a reason there used to be market regulation and breaking up of monopolies. Nowadays we are trying out changes to a stable state that took centuries to reach, because keeping it would be so yesterday, and we will soon find out why that state was chosen in the first place.
This happens when you get worse and worse inequality in buying power. The most accurate prediction of how this all plays out, I think, is what Gary Stevenson calls "The Squeeze Out" -> https://www.youtube.com/watch?v=pUKaB4P5Qns
Currently we are still at the stage of extraction from the upper/middle class: retail investors and pension funds being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and to some extent history) says it can lead to war.
Agree with the analysis or not, I personally think it is quite compelling to what is happening with AI, worth a watch.
Markets are voting machines in the short term and weighing machines in the long term. We’re in the short term popularity phase of AI at the moment. The weighing will come along eventually.
Just like some of the crypto booms and busts, if you time it right this could be a good thing. Buy on a refresh cycle when AWS dumps a bunch of used or refurbished chips and RAM (some places even offer a warranty, which is nice).
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though!
> where resources can be massively misallocated
It's a little ironic to call this a market failure due to resource misallocation when high prices are precisely how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
> resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns
That's basically what the rich usually do. They command a disproportionate amount of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or a weakness, I'm sure some research will show.
> the insane frothing hype behind AI is showing me a new kind of market failure
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important doesn't mean it's a market failure. It's actually the opposite: consumers are purchasing it at a price that matches how they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
> … showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals that are acting irrationally, right? The market is acting correctly according to its design and inputs. The market’s price adjustment is rational in response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis deaccentuate common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, one has to think hard. To some degree, one would have to give up a key aspect of most market systems: the notion of one price set by the idea of "willingness to pay".
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
It is the market working as expected, but it still failed to allocate money diversely.
OpenAI appears to have bought the DRAM, not to use it, as they are apparently buying it in unfinished form, but explicitly to take it off the market and cause this massive price increase & squash competition.
I would call that market manipulation (or failure, if you wish). In a just society Sam Altman would be heading to prison.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
As someone who advocates using capitalism only as a tool in specific areas and trying to move past it in others, I'll defend it here and say that's not really a market anymore when this happens.
Hyper-concentration of wealth is going to lead to the same issues that command economies have, where low-level capital allocation (buying shit) isn't getting feedback from everyone involved and is just going off one asshole's opinion.
Going to be awesome tho when OpenAI et al fail because the market is going to be flooded with cheap parts.
Not even. Tulips were non-productive speculative assets. NFTs were what the tulip was. The AI buildout is more like the railroad mania in the sense that there is froth but productive utility is still the output.
Tulips also grew and could be bred.
The actual productive output of these AI tools is a tiny fraction (actually) of the mania, and the underlying models can be trivially reproduced in massive quantity without the spend that is currently ongoing.
The big bubble is because (like with tulips back then), there was a belief in a degree of scarcity (due to apparent novelty) that didn’t actually exist.
The market failure results from those people having way more money than logic and economic principles dictate they should. A person would normally have to make a lot of good decisions in a row to get that much money, and would be expected continue making good decisions, but also wouldn't live long enough to reach these extreme amounts. However, repeated misallocation by the federal government over the last several decades (i.e. excessive money printing) resulted in people getting repeatedly rewarded for making the right kind of bad economic decisions instead.
I don't know if the term console even makes sense any more. It's a computer without a keyboard and mouse. And as soon as you do that, it's a PC. So I don't see how this makes any sense or will ever happen.
At $DAYJOB, we have had confirmed and paid for orders be cancelled within the last week due to price hikes. One DDR5 server configuration went from ~$13k to near $25k USD in a matter of days.
We also were looking for DDR4 memory for some older machines and that has shot up 2x as well.
Hate this AI timeline.
It's wild. I bought 64GB (2x32) DDR4 SODIMM (CT2K32G4SFD832A) for $100 this April. Cheapest I can find it today is $270.
I picked up 32GB (2x16GB) DDR4 (CMK32GX4M2E3200C16) last September for $55. Now it's $155.
Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
There still isn't a clear path to profitability for any of these AI products and the capital expenditure has been enormous.
> Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
Their inventories are not what consumers use.
Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different. And the memory for GPUs is normally soldered directly to the board (and of the GDDRn family, instead of the DDRn or LPDDRn families used by most CPUs).
As for GPUs, they're also different. Most consumer GPUs are PCIe x16 cards with DP and HDMI ports; most hyperscaler GPUs are going to have more exotic form factors like OAM, and not have any DP or HDMI ports (since they have no need for graphics output).
So no, unfortunately hyperscalers dumping their inventories would be of little use to consumers. We'll have to wait for the factories to switch their production to consumer-targeted products.
Edit: even their NVMe drives are going to have different form factors like E1.S and different connectors like U.2, making them hard for normal consumers to use.
> Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different.
All you need is a fixed-latency, dumb translator bridge where the adapter forces everything into a simplified JEDEC-compliant mode.
A CA/CK line translator with a fixed retimer, since the biggest mismatch between RDIMM and UDIMM is the command/address path.
RDIMMs route CA/CK through an RCD to the DRAM, while UDIMMs route CA/CK to the DRAM directly. Take the incoming CA/CK, delay + buffer + level-shift it, and feed it into an RCD-like input using a delay-locked loop (DLL).
Throw in an SPD translator, PMIC and voltage correction, DQ line conditioning and some other stuff on a 10-12-layer PCB with retimer chips, VRMs, and level shifters.
It would cost about $40 million to fab and about $100 per adapter, but it would make bank with all the spare RDIMMs when the bubble bursts.
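Taking the comment's own $40M and $100/adapter figures, with an assumed selling price, the breakeven arithmetic looks like this:

    # Back-of-envelope breakeven for the proposed adapter.
    # NRE and unit cost are the comment's figures; the selling
    # price is an assumption for illustration.
    nre = 40e6          # one-time design/fab cost
    unit_cost = 100.0   # per-adapter build cost
    price = 150.0       # assumed selling price

    breakeven_units = nre / (price - unit_cost)
    print(f"breakeven at {breakeven_units:,.0f} adapters")  # 800,000

So "making bank" requires moving adapters in the high hundreds of thousands, plausible only if the bubble actually bursts and floods the market with server DIMMs.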
I imagine the cost is primarily in the actual DRAM chips on the DIMM. So availability of RDIMMs on the market will affect DRAM prices anyway. These days lots of motherboards come with Oculink, etc. and you can get a U.2 PCIe card for rather cheap.
I put together a small server with mostly commodity parts.
I see it a bit differently. In marketing, companies like AppLovin with the Axon Engine and Zeta Global with Athena are already showing strong profitability, both in earnings and free cash flow. They’re also delivering noticeably higher returns on ad spend compared to pre-AI tools for their customers. This is the area I’m researching most closely, so I can only speak for marketing, but I’d love to hear from others seeing similar results in their industries.
It's a bit of a shame these AI GPUs don't actually have DisplayPort/HDMI output ports, because they would make for nice, cheap, and powerful gaming GPUs with a lot of VRAM.
Will just have to settle for insanely cheap second hand DDR5 and NVMe drives I guess.
AI GPUs suck for gaming. I have seen a video of a guy playing Red Dead Redemption 2 on an H100 at a whopping 8 FPS! And that was after some hacks, because otherwise it wouldn't run at all.
AI GPUs are stripped away of most things display-related to make room for more compute cores. So in theory, they could "work", but there are bottlenecks making that compute power irrelevant for gaming, even if they had a display output.
So they aren't really GPUs anymore, but more like a different kind of compute accelerator.
If you can't afford the electricity to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!
But anyway, the trick is to run it in the winter and keep your house warm.
A single machine for personal inference on models of this size isn't going to idle so high that electricity becomes a problem, and for personal use it's not like it would be under load often. If for some reason you are able to keep it under heavy load, presumably it's doing something valuable enough to easily justify the electricity.
I'm especially annoyed that this is most likely intentional.
(Not at all)OpenAI saw they were falling behind while their competitors kept getting better (GPT-5 and 5.1 were progressively worse for my use case: actual problem solving and tweaking existing scripts; Claude was miles ahead, and I used GPT only because of the lower price). Now open-weights models like Qwen3 and Kimi K2 have exceeded their capability, and you can run them at home if you have the hardware, or for peanuts from a variety of providers. Cheaper hardware like Strix Halo (and Nvidia DGX) has made 128GB of VRAM achievable for enthusiasts. And Google is eating their lunch with Gemini.
All while their CFO starts talking about the government bailing them out of spending they cannot possibly fund.
Of course they will attempt to blow up the entire hardware market, so that if their AI flops they will at least be able to rent out hardware like AWS.
Small correction: Strix has at most 96GB available for the GPU. But that's still plenty.
But yeah, both AMD and Intel are also pushing NPUs built into their higher-end offerings, so there is a very good chance that a good portion of AI will happen closer and closer to users.
It’s telling that the comment is highly upvoted (as I write this comment, anyway) despite being incoherent and incomplete in multiple places. I guess being generically angry and complaining about popular targets like OpenAI is an easy way to earn upvotes from visitors who don’t actually read, just scan comments for keywords and vibes.
Somebody do the math on when we will reliably start running out of grid power. Only then will this "AI buildout" slow down. Manufacturing generators is boring, and far less invested in than manufacturing AI servers.
I wish I saw more people realizing that the buildout promises being made by the big AI players are physically impossible.
Not only is it impossible to build that much power generation on those timelines,
it's also not possible to build enough GPUs to fill a purported tripling of US datacenter capacity.
What's the ROI on giant empty warehouses full of empty server racks and no electricity?
That's why it's increasingly important to find answers for how to build these models sustainably. The approach of training with HUGE amounts of data requiring HUGE infra seems to have blinded the hype-bros: they are not planning to innovate toward doing it at small scale.
Also the transformers. (The big magnetic ones for voltage conversion.)
At least if the robots in disguise turned up, we'd answer the AGI question
Wild experience building a PC today and discovering that prices are less competitive with Macs than they've ever been. Building a well-appointed gaming/production/CAD rig is suddenly very expensive between RAM, GPU, and NVMe prices being so high.
I can't imagine Apple doesn't have capacity booked well in advance, and their suppliers aren't going to stiff them because they'd lose those long-term contracts. Sure, if the shortage lasts a year or more, there'll be issues, but if it's short term they might be fine.
This problem is so much worse when you look at server mobo configurations that basically jumped from 8 to 12 slots. Meaning you need 50% more sticks to saturate versus Epyc 7003/2. I was hoping to build a Genoa-X server for compute and ram cost just went bonkers (that’s on a nearly 2-yo system). I decided to shelve that idea.
We've been getting increasingly fucked for years on housing prices, healthcare, food, live entertainment, etc. Consumer electronics were one of the few areas where you could at least argue you were getting more value per dollar each year. GPUs have been a mess for a while now, but now it seems like it's just going to be everything.
So glad I bought 128gb ddr5 for my desktop a year ago... I usually don't need it all but it was cheap at the time. Most I use it for is cpu offloading for LLMs too big for my 3090 and for running 10 or so small VMs for my projects.
Apple ram prices suddenly started to look a little reasonable.
Who am I kidding; an increase this big means these changes are here to stay. It's not a gradual change at all.
I mentioned this previously in a thread about node sizes and got downvoted for it, but I stand by my opinion: the rest of the world, i.e. normal people, needs China to become competitive in chip manufacturing.
Without that competition, everyday consumers are going to get priced out of the market by major corporations. We have reached a point in CPU technology where newer tech is no longer automatically cheaper and faster to make; therefore, we need more competition to keep prices down.
Except a lot of other people don't live in countries that are allied with China's vision. We need more competition among our allies, not have the only competition come from our enemy.
Related: https://news.ycombinator.com/item?id=46012710 (from 2024)
> HBM chips are now emerging as another bottleneck in the development of those models. Both SK Hynix and Micron, an American chipmaker, have already pre-sold most of their HBM production for next year. Both are pouring billions of dollars into expanding capacity, but that will take time. Meanwhile Samsung, which manufactures 35% of the world's HBM chips, has been plagued by production issues and reportedly plans to cut its output of the chips next year by a tenth.
Companies that invested in CXL got their money's worth. CXL is basically older RAM connected over PCIe. Not only are you not throwing away RAM that can't be used with current-generation motherboards and chipsets, you also have a way to get a lot of slower memory for applications that don't need the best and newest.
If we're going to see retailers price-gouging on DDR5, maybe people will be willing to buy slightly older gear with DDR4 (and corresponding motherboard and CPU).
Especially for systems for which the workloads are actually bound by GPU compute, network, or storage.
I just snagged an Asrock Rack mobo (X570), 5900x and 128gb ecc ddr4 for $680. Felt like a steal with how memory prices are going these days, ECC to boot.
The Reuters article referenced: https://www.reuters.com/world/china/samsung-hikes-memory-chi...
It will. The big 3 (Micron, Samsung, SK Hynix) have announced they will quit making LPDDR4 so that they can make more (LP)DDR5.
Even if production capacity wasn't shared/shifting to the higher end products (which it seemingly is), there's certainly going to be an increase in demand for DDR4 as it acts as a substitute good. Prices are already up significantly.
Gamers Nexus is reporting increasing DDR4 prices, but it’s unclear to what extent it’s driven by the DDR5 market. DDR4 production is expected to be slowing anyway given the move to DDR5.
> Just how sure are we the AI bubble is the entire reason for these absurd prices?
We're not, and the market dictates that they don't have to talk to each other to know to jack up prices.
This RAM price spike comes as Nvidia reports for this quarter: gross margins were 70 percent. It looks like their year-over-year doubling of margins is not because they came anywhere close to shipping double the number of units.
Meanwhile if you look at Micron their gross margin was 41% for fiscal year 2025, and 2024 looks to be 24%.
Micron and its peers are competing with Nvidia for shareholder dollars (the CEO's real customer). They're jacking up prices because enough of the market is dumb enough to bear it right this second. And every CEO has to be looking at those numbers and thinking the same thing: "Where is my cut of the pie? Why aren't we at 60 percent?"
We're now at a point where hardware costs are going to inhibit development. Everyone short of the biggest players is now locked out, and that's not sustainable. Of the AI ventures there is only one that seems to have a reasonable product, and possibly reasonable financials. Many of the other players likely aren't going to be able to weather the write-downs.
The music will stop, the question is when.
Maybe it was a bad idea for Intel to sell off their memory unit
Just about time that I'm finally contemplating upgrading my 13 y.o. laptop >:E
So first it was Bitcoin/crypto, now it's AI. PC gaming is dead at this point. I wonder if it will force developers to care about doing more with less hardware and optimizing now.
>Samsung and its peers, SK hynix and Micron Technology, have redirected much of their fabrication capacity to high-end chips used in AI servers. While this shift yields higher margins, it leaves less capacity for traditional DRAM products that power laptops, desktops, and mainstream servers.
So if the AI bubble does pop in early 2026, you will get a tsunami of cheap server RAM. You still won't be able to find cheap PC RAM. So either way, the short term future of computing is firmly fixed in the cloud.
If the AI bubble exploded today, we'd probably still see prices at this level for a year.
If it doesn't, expect years until enough new capacity is built.
I had a simple Proxmox/k8s cluster going, and fitting out the nodes with RAM was the last thing on my list. It was cheapo ol' DDR4.
Where I live, the price for my little cluster project has gone up from around ~$400 in July (for a 5-node setup) to almost $2000 right now. I just refreshed the page and it's up 20% day-over-day. Welp. I guess they are going to stay with 8GB sticks for a while.
> I'd rather slow down my spending
And that will result in even more resources being allocated to the "big spenders". The whole PC field has been in a death spiral for a long time: if it wasn't crypto mining (multiple times), it was HDD mining, then the pandemic, and now it's AI.
What used to be a stable, predictable market has become ultra expensive. And now SSD and DDR pricing are going to hurt even more.
Worst of all, a lot of production capacity is now going to enterprise hardware. So even if the AI bubble deflates, the market won't be flooded with cheap NVMe drives or cheap DDR sticks, because that production will have gone into 2.5" U.3 drives and LPDDR memory or the like.
Thanks "Open""AI", Trump for making us pay for the "AI" infrastructure. In the original deal "Open""AI" claimed Samsung would scale up production:
https://openai.com/index/samsung-and-sk-join-stargate
The Samsung announcement contains no reference to scaling up production:
https://news.samsung.com/ca/samsung-and-openai-announce-stra...
Semiconductor companies have been bitten in the past by scaling up production into a bubble, so of course Samsung just raises prices. When you buy DRAM, remember that you are financing oligarchs and that Stargate has lied yet again.
So AI drives prices up, directly and indirectly. I am not happy, in part because I actually need to purchase new hardware (eventually, and unfortunately quite soon, probably next year already, again).
I think there must be a tax on all those AI corporations; they cost us as a society WAY too much. We need to bring this into the discussion; right now lobbyists, such as the orange king, want to ban all such discussion, i.e. make AI investments exempt from numerous things. This is leeching on the general taxpayers, in all countries. It is not acceptable.
>> I am not happy ... I think there must be a tax on all those AI corporations
There are a lot of elements of this AI shit-show that I don't like or worry about, but taxing them specifically because they're driving up the price of memory when you want some is not really a "societal cost". You then mention something about "general taxpayers": didn't you just lobby to make them into a super taxpayer? Go ahead and rant, but this seems like pretty basic supply and demand. Keep some perspective; it's computer memory, not bread.
I pre-ordered and picked up a Framework Desktop with 128GB of DDR5-8000 inside it. This is the type of system that is an indirect byproduct of the shift toward AI. It may not have been what AMD was originally intending with the AI Max 395+ line, but it definitely is the kind of optimized thinking that will put AI in the hands of consumers.
That's part of the reason I think this boom-bust cycle might be a bit different. Hopefully Intel can use some of the capacity it has coming online in its foundry business to serve this need.
> This is the type of system that is an indirect byproduct of the shift toward AI. It may not have been what AMD was originally intending with the AI Max 395+ line, but it definitely is the kind of optimized thinking that will put AI in the hands of consumers.
It literally was intended for exactly that. It has AI in the name of the CPU, and it was from the get-go targeted at AI and GPU-heavy workloads (3D rendering etc.).
All I can say is,
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns. Even if it squeezes out every single other sector that happens to want to use SDRAM to do things OTHER than buffer memory before it's fed into a PCIe lane for a GPU.
- I'm really REALLY glad I decided to buy brand new gaming laptops for my wife and me just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have it the worst: GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users, where it's a business thing and they can to some degree just hike prices to cover it, gamers are obviously not running businesses. It's just making the hobby more expensive.