Samsung's 60% DRAM price hike signals a new phase of global memory tightening
(buysellram.com)
436 points by redohmy 8 days ago
DRAM alternates between feast and famine; it's the nature of a business where the granularity of investment is so huge (you have a fab or you don't, and they cost billions - maybe trillions by now). So it will swing back. Unfortunately it looks like maybe 3-5 years on average, from some analysis here: https://storagesearch.com/memory-boom-bust-cycles.html
(That's just me eyeballing it, feel free to do the math)
I am so glad that both the top-rated comment and the majority of comments on HN finally understand the DRAM industry, instead of the constant "DRAM is a cartel, that's why things are expensive."
Also worth mentioning that the profit Samsung makes on DRAM and NAND is what keeps Samsung Foundry fighting TSMC. Especially for those who think TSMC is somehow a monopoly.
Another thing to point out, which has not been mentioned yet: China is working on both DRAM and NAND. Both LPDDR5 and stacked NAND are already in production and waiting for yield and scale. Higher prices will finally be perfect timing for them to join the commodity DRAM and NAND race. Good for consumers, I suppose; not so good for a lot of other things, which I won't go into.
I wouldn't be so sure. I've seen analyses making the case that this new phase is unlike previous cycles and DRAM makers will be far less willing to invest significantly in new capacity, especially into consumer DRAM over more enterprise DRAM or HBM (and even there there's still a significant risk of the AI bubble popping). The shortage could last a decade. Right now DRAM makers are benefiting to an extreme degree since they can basically demand any price for what they're making now, reducing the incentive even more.
https://www.tomshardware.com/pc-components/storage/perfect-s...
The most likely direct response is not new capacity, it's older capacity running at full tilt (given the now higher margins) to produce more mature technology with lower requirements on fabrication (such as DDR3/4, older Flash storage tech, etc.) and soak up demand for these. DDR5/GDDR/HBM/etc. prices will still be quite high, but alternatives will be available.
Maybe we'll get ECC by default in everything with this?
Is this still the case in 2025, though?
In a traditional pork cycle there's a relatively large number of players and a relatively low investment cost. The DRAM market in the 1970s and 1980s operated quite similarly: you could build a fab for a few million dollars, and it could be done by a fab which also churned out regular logic - it's how Intel got started! There were dozens of DRAM-producing companies in the US alone.
But these days the market looks completely different. It is roughly equally divided between SK Hynix, Micron, and Samsung. Building a fab costs billions and can easily take a year or 5 - if not a decade - from start to finish. Responding to current market conditions is basically impossible; you have to plan for the market you expect years from now.
Ignoring the current AI bubble, DRAM demand has become relatively stable - and so has the price. Unless there's a good reason to believe the current buying craze will last over a decade, why would the DRAM manufacturers risk significantly changing their plans and potentially creating an oversupply in the future? It's not like the high prices are hurting them...
Also, current political turbulence makes planning for the long term extremely risky.
Will the company be evicted from the country in 6 months? A year? Will there be 100% tariffs on competitors' imports? Or 0%? Will there be an anti-labor gov't in effect when the investment might mature, or a pro-labor one?
The bigger the investment, the longer the investment timeframe, and the more sane the returns - the harder it is to make the investment happen.
High risk requires a correspondingly high potential return.
That everyone has to pay more for current production is a side effect of the uncertainty, because no one knows what the odds are of even future production actually happening, let alone the next fancy whiz-bang technology.
But people do need the current production.
A wafer is a wafer. The cost is per square mm. It's pure supply and demand.
No, a wafer is very much not a wafer. DRAM processes are very different from making logic*. You don't just make memory in your fab today and logic tomorrow. But even when you stay in your lane, the industry operates on very long cycles and needs scale to function at any reasonable price at all. You don't just dust off your backyard fab to make the odd bit of memory whenever it is convenient.
Nobody is going to do anything if they can't be sure that they'll be able to run the fab they built for a long time and sell most of what they make. Conversely fabs don't tend to idle a lot. Sometimes they're only built if their capacity is essentially sold already. Given how massive the AI bubble is looking right now, I personally wouldn't expect anyone to make a gamble building a new fab.
* Someone explained this at length on here a while ago, but I can't seem to find their comment. Should've favorited it.
I just looked at the invoice for my current PC parts that I bought in April 2016: I paid 177 EUR (~203 USD) for 32GB (DDR4-2800).
It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
Well, I've experienced both to some degree in the past. The previous long stretch of very similar hardware performance was when PCs were exorbitantly expensive and the Commodore 64 was the main "home computer" (at least in my country), through the late 80s and early 90s.
That period of time had some benefits. Programmers learned to squeeze absolutely everything out of that hardware.
Perhaps writing software for today's hardware is again becoming the norm rather than being horribly inefficient and simply waiting for CPU/GPU power to double in 18 months.
I was lucky. I built my AM5 7950X Ryzen PC with 2x48GB DDR5 two years ago. I just bought a 4x48GB kit a month ago, with the idea of building another home server with the old 2x48GB kit.
Today my old G.Skill 2x48GB kit costs double what I paid for the 4x48GB.
Furthermore, I bought two used RTX 3090s (for AI) back then. A week ago I bought a third one for the same price (for VRAM in my server).
> It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
But you’re cherry picking prices from a notable period of high prices (right now).
If you had run this comparison a few months ago or if you looked at averages, the same RAM would be much cheaper now.
We’re just consuming a lot of DRAM in general.
I paid about GBP 20K for the 192MB RAM in a Sun SPARC 5 workstation in 1995. That’s maybe $27K USD in 1995 dollars. Gulp.
I think that goes to show that official inflation benchmarks are not very practical or useful in terms of buckets of things that people actually buy or desire. If the bucket that measured inflation included computer parts (GPUs?), food and housing - i.e. all the things that a geek really needs - inflation would be way higher...
> If the bucket that measured inflation included computer parts (GPUs?), food and housing - i.e. all the things that a geek really needs - inflation would be way higher...
A house is $500,000
A GPU is $500
You could put GPUs into the inflation bucket and it wouldn’t change anything. Inflation trackers count cost of living and things you pay monthly, not one time luxury expenses every 4 years that geeks buy for entertainment.
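As a rough illustration of that point (all weights and prices below are invented for the example, not actual CPI data), amortizing a $500 GPU over four years barely registers next to monthly housing and food costs:

# Toy cost-of-living basket; every number here is an illustrative assumption.
monthly_basket = {
    "housing": 2000.0,
    "food": 600.0,
    "transport": 400.0,
    "other": 500.0,
}
gpu_monthly = 500.0 / 48  # a $500 GPU amortized over ~4 years

base = sum(monthly_basket.values()) + gpu_monthly
# Suppose everything else rises 3% while GPUs rise 30% year over year.
inflated = sum(v * 1.03 for v in monthly_basket.values()) + gpu_monthly * 1.30

print(f"GPU share of the basket: {gpu_monthly / base:.2%}")                  # ~0.3%
print(f"Basket inflation with the GPU included: {inflated / base - 1:.2%}")  # ~3.1% vs 3.0% without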
Also we’re likely comparing RAMs at different speeds and memory bandwidth.
If the sticker price stayed the same since 2016, it got about 35% cheaper due to inflation.
I just gave up and built an AM4 system with a 3090 because I had 128GB of DDR4 UDIMMs on hand; the whole build cost less than the memory alone would have for an AM5/DDR5 build.
Really wish I could replace my old Skylake-X system, but even DDR4 RDIMMs for an older Xeon are crazy now, let alone DDR5. Unfortunately I need slots for 3x Titan Vs for the 7.450 TFLOPS each of FP64. Even the 5090 only does 1.637 TFLOPS for FP64, so I'm just hoping that old system keeps running.
My 64gb DDR5 kit started having stability issues running XMP a few weeks out of warranty. I bought it two years ago. Looked into replacing it and the same kit is now double the price. Bumping the voltage a bit and having better cooling gets it through memtest thankfully. The fun of building your own computer is pretty much gone for me these days.
Doubled in the last 4 months https://www.youtube.com/watch?v=o5Zc-FsUDCM
Upgraded by adding 64GB.. last Friday I sold the 32 GB I took out for what I paid for the 64 GB in July... insane
Time to start scouring used-PC sales to reclaim the RAM and sell it for a profit?
Have you not noticed the domain of the submitted article? Others are way, way ahead on that already.
(Including the submitter. In their comment history is "Tip: You can sell used server RAM or desktop modules through BuySellRam to recover value from old hardware." at https://news.ycombinator.com/item?id=45800881 and all of the submissions of this domain are from this user: https://news.ycombinator.com/from?site=buysellram.com )
If you can find used PCs being liquidated with DDR4 RAM that is fast enough for a modern build, then you might.
Old RAM that comes out of the PCs being sold at fire sale prices isn’t really in demand though. Even slower DDR4 grades aren’t seeing much demand.
Why do we all need 128GB now? I was happy with 32.
Close a few Chrome tabs, and save some DDR5 for the rest of us. :-)
Last night, while writing a LaTeX article, with Ollama running for other purposes, Firefox with its hundreds of tabs, and multiple PDF files open, my laptop's memory usage spiked up to 80GB... And I was happy to have 128GB. The spike was probably due to some process stuck in an effing loop, but the process consuming more and more RAM didn't have any impact on the system's responsiveness, and I could calmly quit VSCode and restart it with all the serenity I could have in the middle of the night. Is there even a case where more RAM is not really better, except for its cost?
> Is there even a case where more RAM is not really better, except for its cost?
It depends. It takes more energy, which can be undesirable in battery powered devices like laptops and phones. Higher end memory can also generate more heat, which can be an issue.
But otherwise more RAM is usually better. Many OSes will dynamically use otherwise unused RAM to cache filesystem reads, making subsequent reads faster, and many databases will prefetch into memory if it is available, too.
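As a small illustration of that caching (a minimal sketch assuming Linux, where /proc/meminfo exposes these counters; other OSes report the same idea differently):

# Show how much nominally "free" RAM the kernel is actually using as page cache,
# which is why spare memory still speeds up repeated file reads.
def meminfo_kb():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])  # values are reported in kB
    return info

m = meminfo_kb()
for field in ("MemTotal", "MemFree", "MemAvailable", "Cached", "Buffers"):
    print(f"{field:>13}: {m[field] / 1024 / 1024:6.1f} GiB")
# "MemFree" is usually small on a busy machine, while "MemAvailable" stays large
# because most of "Cached" can be dropped instantly when applications need it.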
Firefox is particularly good at having lots of tabs open and not using tons of memory.
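# Count open tabs across all Firefox windows: decompress the session-store file (recovery.jsonlz4) and sum each window's tab count with jq.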
$ ~/dev/mozlz4-tool/target/release/mozlz4-tool \
"$(find ~/Library/Application\ Support/Firefox/Profiles/ -name recovery.jsonlz4 | head -1)" | \
jq -r '[.windows[].tabs | length] | add'
5524
Activity monitor claims firefox is using 3.1GB of ram. Real memory size: 2.43 GB
Virtual memory size: 408.30 GB
Shared memory size: 746.5 MB
Private memory size: 377.3 MB
That said, I wholeheartedly agree that "more RAM, less problems". The only case I can think of where it's not strictly better to have more is during hibernation (cf. sleep), when the system has to write 128GB of RAM to disk.
It depends on what you are doing.
If you are working on an application that has several services (database, local stack, etc.) as docker containers, those can take up more memory. Especially if you have large databases or many JVM services, and are running other things like an IDE with debugging, profiling, and other things.
Likewise, if you are using many local AI models at the same time, or some larger models, then that can eat into the memory.
I've not done any 3D work or video editing, but those are likely to use a lot of memory.
I like to tell people I have 128GB. It's pretty rare to meet someone like me that isn't swapping all the time.
The cost of inventory on the shelves basically doesn’t matter. The only thing that matters is the market rate.
If those retailers didn’t increase their prices when the price hike was announced, anyone building servers would have instantly purchased all of the inventory anyway at the lower prices, so there wouldn’t actually have been weeks of low retail RAM prices for everyone.
Every once in a while you can catch a retailer whose pricing person missed the memo and forgot to update the retail price when the announcement came out. They go out of stock very rapidly.
> If those retailers didn’t increase their prices when the price hike was announced, anyone building servers would have instantly purchased all of the inventory anyway at the lower prices
But that retailer would have made a lot of money in a very short time.
In the scenario where they don't raise prices, they sell out immediately. In the scenario where they do raise prices, it's too expensive so you don't buy it. In the scenario where they keep prices low, and do a lottery to see who can buy them, you don't get picked.
No matter what, you are not getting those modules at the old price. There are few things that trip up people harder than this exact scenario, and it happens everywhere. Concert tickets, limited releases, water during crises, hot Christmas gift, pandemic GPUs, etc.
Once understood you can stop getting mad over it like it's some conspiracy. It's fundamental and natural market behavior.
Yeah you are not alone here being annoyed. I think we need to penalise all who drive the prices up - that includes the manufacturers but also AI companies etc...
Those price increases are not normal at all. I understand that most of it still comes from market demand, but this is also skewing the market in unfair ways now. Such increases smell of criminal activity too.
> I think we need to penalise all who drive the prices up - that includes the manufacturers but also AI companies etc...
You want to penalize companies for buying things and penalize companies for selling things at market rate?
There are a lot of good examples through history about how central planning economics and strict price controls do not lead to good outcomes. The end result wouldn’t be plentiful cheap RAM for you. The end result would be no RAM for you at all because the manufacturers choose to sell to other countries who understand basic economics.
Such is life. I suggest finding a less volatile hobby, like crocheting.
Actually, the textile market is pretty volatile in the US these days with Joann's out of business. Pick a poison, I guess? There's little room for stability in a privately owned world.
> I have no idea if/when prices will come back down but it sucks.
Usually after the companies are fined for price-fixing
I think it's somewhat useful long term advice, and I would add that parts prices tend to be asynchronous.
Building a PC in a cost efficient manner generally requires someone to track parts prices over years, buy parts at different times, and buy at least a generation behind.
The same applies to many other markets/commodities/etc...
All I can say is,
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns. Even if it squeezes out every single other sector that happens to want to use SDRAM to do things OTHER than buffer memory before it's fed into a PCIE lane for a GPU.
- I'm really REALLY glad i decided to buy brand new gaming laptops for my wife and I just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have this the worst - GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users where it's a business thing and they can to some degree just hike prices to cover - gamers are obviously not running businesses. It's just making the hobby more expensive.
It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar, a bunch of executives following the mandates of their board, all because there's a few CEOs who were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and have better decision making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes that there's somewhat equal decision-making power amongst the econs. We are quickly trending away from that.
The funniest thing is that somehow the executive class is even more out of touch than they used to be.
At least before there was a certain common baseline derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to have a full company of tech workers despising the current plan and yet that person being constantly reinforced by linkedin and chatgpt.
The out of touch leader is a trope that I'm willing to bet has existed as long as we've had leaders.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch, as much as we're way more aware?
No surprise, the CxO class barely lives in the same physical world as us peasants. They all hang out together in their rich-people restaurants and rich-people galas and rich-people country clubs and rich-people vacation spots, socializing with other rich-people and don't really have a lot of contact with normal people, outside of a handful of executive assistants and household servants.
We need better antitrust and anti-monopoly enforcement. Break up the biggest companies, and then they'll have to actually participate in markets.
This was Lina Khan's big thing, and I'd argue that our current administration is largely a result of Silicon Valley no longer being able to get exits in the form of mergers and IPOs.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
I think a better solution is an exponential tax on company size. I.e., once a company starts to earn above, say, 1 billion, its income will be taxed at an ever-increasing rate. Or to put it another way: use taxes to break the power law and the winner-takes-all effect into a Gaussian distribution of company sizes.
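For illustration only, a minimal sketch of one possible shape for such a schedule (the threshold, base rate, ceiling, and scale below are invented parameters, not a concrete proposal from the comment):

import math

# Illustrative only: the effective rate keeps climbing the further income exceeds
# a $1B threshold, approaching (but never reaching) a chosen ceiling.
def effective_rate(income, threshold=1e9, base=0.21, ceiling=0.90, scale=5e10):
    if income <= threshold:
        return base
    excess = income - threshold
    return base + (ceiling - base) * (1 - math.exp(-excess / scale))

for income in (5e8, 1e9, 5e9, 5e10, 5e11):
    print(f"income ${income:>14,.0f} -> effective tax rate {effective_rate(income):.1%}")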
> There's too much group-think in the executive class.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
Diversity is good for populations. If you have a tiny pool of individuals with mostly the same traits (in this case I mean things like culture, education, morality, ethics, rather than class and race - though there are obvious correlations) then you get what some other comments are describing as being effectively centralized planning with extra steps, rather than a market of competing ideas.
> We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note, the military is certainly a money-burning machine, but IMHO it's only government spending, while most of the money in the US is deliberately private.
Could the fintech sector be a bigger example of a money-vacuuming system, benefiting statistically nobody?
Exactly. So instead of electing the people who will allocate the resources, the people who are successful in one thing are given the right to manage the resources for whatever they wish, and they can keep being very wrong for a very long time while other people are deprived of the resources due to the mismanagement and can't do anything about it.
In theory I guess this creates a demand that should be satisfied by the market, but in reality it seems like when the wealth is too concentrated in the hands of the few who make all the decisions, the market is unable to act.
Centralized planning is needed in any civilization. You need some mechanism to decide where to put resources, whether it's to organize the annual school's excursion or to construct the national highway system.
But yeah, in the end companies behave in trends; if some companies do it, then the other companies have to do it too, even if this makes things less efficient or is even hurtful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on that information.
There are Pascal's-wager arguments to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them toward AI efforts plus annoying customers with AI features they don't want or need.
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure that people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to the people controlling the AI, with all the consequences this has.
If you get paid for being rich in proportion to how rich you are -- because that's how assets work -- it turns into an exponential, runs away, and concentrates power until something breaks.
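Spelled out (this is just the standard compound-growth identity, not anything specific from the comment): if wealth $W$ earns a return $r$ proportional to itself, then

\[ \frac{dW}{dt} = rW \quad\Longrightarrow\quad W(t) = W_0\,e^{rt}, \]

so asset returns compound exponentially while income that isn't reinvested grows at best linearly, which is the runaway concentration being described.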
It's centralized vs. decentralized not public vs. private. A centralized private planning committee is still centralized.
This is why I think taxes on the very wealthy should be so high that billionaires can't happen. The usual reasons are either about raising revenue or are vague ideas about inequality. It doesn't raise enough revenue to matter, and inequality is a fairly weak justification by itself.
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
You'll just get a different form of power concentration. Do you think the Soviet Union didn't have power concentration in individuals? Of course it did, that's why the general secretary of the party was more important than the actual heads of state and government.
> But the power concentration is a strong reason.
A centralized authority capable of so severely restricting the economic freedom of the most powerful people implies a far greater concentration of power than the one you're fighting against. You're proposing to cure the common cold with AIDS.
>It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
No, it's pure capitalism where Atlas shrugged and ordered billions worth of RAM. You might not like it but don't call it "centralized planning" or "Soviet era".
> Every corporation is a (not so) little pocket of centrally planned economy.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
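For reference, the textbook condition being paraphrased here (generic profit maximization, nothing specific to DRAM):

\[ \max_q \; \pi(q) = R(q) - C(q) \quad\Longrightarrow\quad \frac{d\pi}{dq} = R'(q) - C'(q) = 0 \quad\Longleftrightarrow\quad MR(q) = MC(q). \]

For a price-taking firm $MR(q) = p$, so output expands until price equals marginal cost; prices, not a plan, set the quantity.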
I disagree.
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible infrastructure is and how we've ignored trillions of maintenance to give a few thousand people tax breaks they don't really need.
>AI is the first industrial breakthrough in a century
Is it?
Why not follow the time-honoured approach and put the data centres in low-income countries?
The British deindustrialized India.
I assume they don't have good enough power infrastructure.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
This resonates deeply, especially to someone born in the USSR.
This is part of how free markets self correct, misallocate resources and you run out of resources.
You can blame irrational exuberance, bubbles, or whatnot; markets are ultimately individual choices times economic power. AI, crypto, housing, dotcom, etc. going back through history all had excesses because it's not obvious when to join and when to stop.
Usually companies run out of resources before they screw up global prices in massive markets.
If it was a couple billion dollars of memory purchasing nobody would care.
> Usually companies run out of resources before they screw up global prices in massive markets.
It happens more often than you might expect.
The Onion Futures Act and what led to it is always a fun read: https://en.wikipedia.org/wiki/Onion_Futures_Act
> This is part of how free markets self correct, misallocate resources and you run out of resources.
Except that these corporations will almost certainly get a bail out, under the auspices of national security or some other BS. The current admin is backed by the same VCs that are all in on AI.
They're treating it as a "winner takes it all"-kind of business. And I'm not sure this is a reasonable bet.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
What's odd about this is I believe there does exist a winner takes all technology. And that it's AR.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
There is a reason why there used to be market regulation and breaking up of monopolies. Nowadays we are trying out changes to the stable state of past centuries, because keeping it would be so yesterday, and we will soon find out why that state was chosen in the first place.
I wonder, is there any way to avoid this kind of market failure? Even a planned economy could succumb to hype - promises that improved societal efficiency are just around the corner.
> Is there any way to avoid this kind of market failure?
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
There is a way, and if anyone tells you we have to go full Hitler or Stalin to do it they are liars because last time we let inequality cook this hard FDR and the New Deal figured out how to thread the needle and proved it could be done.
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
A tax on scale.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football and will be fought over, just like the Fed prime interest rate. But it's a single scalar instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Why? That's exactly the circumstance where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization, so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, which will be able to retool their production and supply these devices for a higher profit.
>society is better served by having it produced by the few small companies than the one big company.
Well, assuming the scale couldn't be used for the benefit of society rather than to milk it dry. But yes, probably the best approach that has a reasonable chance at success, eventually, maybe.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
How so? Costs will be higher with multiple small producers, resulting in higher costs for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
Unless I get all the resources I want, when I want, all at low prices, the market has obviously failed.
Yes, except unironically. A market that cannot efficiently serve the vast majority of the population is a failed market.
Gamers at least enjoy their GPUs and memory.
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from-- having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're nVidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
What do you think happens when the majority of consumers are priced not only out of bread, but also circuses?
This happens when you get worse and worse inequality when it comes to buying power. The most accurate prediction into how this all plays out I think is what Gary Stevenson calls "The Squeeze Out" -> https://www.youtube.com/watch?v=pUKaB4P5Qns
Currently we are still at the stage of extraction from the upper/middle class retail investors and pension funds being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and somewhat historically) says it can lead to war.
Agree with the analysis or not, I personally think it is quite compelling to what is happening with AI, worth a watch.
Markets are voting machines in the short term and weighing machines in the long term. We’re in the short term popularity phase of AI at the moment. The weighing will come along eventually.
Just like some of the crypto booms and busts if you time it right this could be a good thing. Buy on a refresh cycle when AWS dumps a bunch of chips and RAM used or refurbished (some places even offer warranty which is nice).
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though !
> where resources can be massively misallocated
It's a little ironic to call this a market failure due to resource misallocation when high prices are precisely how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
> resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns
That's basically what the rich usually do. They command disproportionate amounts of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or a weakness, I'm sure some research will show.
> the insane frothing hype behind AI is showing me a new kind of market failure
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important doesn't mean it's a market failure. It's actually the opposite: consumers are purchasing it at a price that reflects how they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
> … showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals that are acting irrationally, right? The market is acting correctly according to its design and inputs. The market's price adjustment is rational in response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis de-emphasize common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, you have to think hard. To some degree, one would have to give up a key aspect of most market systems — the notion of one price set by the idea of “willingness to pay”.
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
It is the market working as expected, but it still failed to allocate money diversely.
OpenAI appears to have bought the DRAM, not to use it, as they are apparently buying it in unfinished form, but explicitly to take it off the market and cause this massive price increase & squash competition.
I would call that market manipulation (or failure, if you wish) - in a just society Sam Altman would be heading to prison.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
As someone who advocates that we only use capitalism as a tool in specific areas and try to move past it in other, I’ll defend it here to say that’s not really a market anymore when this happens.
Hyper-concentration of wealth is going to lead to the same issues that command economies have, where the low-level capital allocations (buying shit) aren't getting feedback from everyone involved and are just going off one asshole's opinion.
Going to be awesome tho when OpenAI et al fail because the market is going to be flooded with cheap parts.
Not even. Tulips were non-productive speculative assets. NFTs were what the tulip was. The AI buildout is more like the railroad mania in the sense that there is froth but productive utility is still the output.
Tulips also grew and could be bred.
The actual underlying models of productive output for these AI tools is a tiny fraction (actually) of the mania, and can be trivially produced at massive quantity without the spend that is currently ongoing.
The big bubble is because (like with tulips back then), there was a belief in a degree of scarcity (due to apparent novelty) that didn’t actually exist.
The market failure results from those people having way more money than logic and economic principles dictate they should. A person would normally have to make a lot of good decisions in a row to get that much money, and would be expected continue making good decisions, but also wouldn't live long enough to reach these extreme amounts. However, repeated misallocation by the federal government over the last several decades (i.e. excessive money printing) resulted in people getting repeatedly rewarded for making the right kind of bad economic decisions instead.
I don't know if the term console even makes sense any more. It's a computer without a keyboard and mouse. And as soon as you do that, it's a PC. So I don't see how this makes any sense or will ever happen.
I'm so mad about this, I need DDR5 for a new mini-PC I bought and prices have literally gone up by 2.5x..
128GB used to be $400 in June, and now it's over $1,000 for the same 2x64GB set.
I have no idea if/when prices will come back down but it sucks.