bcrl 11 hours ago

This! The cost of training models inevitably goes down over time as FLOPS/$ and PB/$ increase relentlessly thanks to the exponential gains of Moore's law. Eventually laptops and phones will be Good Enough to run models locally. Once that happens, any competitor in the space that decides to actively support running locally will have operating costs that are a mere fraction of OpenAI's.
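
A back-of-the-envelope sketch of that claim (the 1.35x/year FLOPS/$ growth rate and the $100M baseline are my own assumptions, purely illustrative):

    # Illustrative only: the growth rate and baseline cost are assumed, not measured.
    BASELINE_COST_USD = 100e6        # hypothetical cost of a fixed-size training run today
    FLOPS_PER_DOLLAR_GROWTH = 1.35   # assumed yearly improvement in FLOPS/$

    for year in range(0, 9, 2):
        cost = BASELINE_COST_USD / FLOPS_PER_DOLLAR_GROWTH ** year
        print(f"year {year}: ~${cost / 1e6:.0f}M for the same training run")
    # year 8: ~$9M -- an order of magnitude cheaper in under a decade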

The pop of this bubble is going to be painful for a lot of people. Being too early to a market is just as bad as being too late, especially for something that can become a commodity due to a lack of moat.

muskyFelon 11 hours ago
  • bcrl an hour ago

    The number of transistors per unit area is still increasing; the gains are just a little slower, and a little more expensive, than they used to be.

    And there are innovations that will continue the scaling Moore's law predicts. Take die stacking as an example. Even Intel had internal studies 20 years ago showing significant performance and power improvements to be had in CPU cores by using 2 layers of transistors. AMD's X3D CPUs now use technology that stacks extra dies onto a base die, but only in the most basic of ways (just for cache). Going beyond cache to logic, die stacking reduces wire length because more transistors with more layers of metal fit in a smaller space. That in turn improves performance and reduces power consumption.
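
    A first-order sketch of the wire-length argument (my own toy model, not the Intel study): if the same logic fits in half the footprint across 2 layers, linear distances shrink by sqrt(2), and since wire capacitance (and thus switching power) scales roughly with length, average wires get ~30% shorter:

        import math

        # Toy model: same transistor count, stacked on 2 layers instead of 1.
        # Assumes average wire length tracks the die's linear dimension.
        layers = 2
        footprint_ratio = 1 / layers               # same logic, half the area
        linear_ratio = math.sqrt(footprint_ratio)  # side length shrinks by sqrt(2)
        print(f"avg wire length: {linear_ratio:.2f}x (~{(1 - linear_ratio) * 100:.0f}% shorter)")
        # avg wire length: 0.71x (~29% shorter)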

    The semiconductor industry isn't out of tricks just yet. There are still plenty of improvements coming in the next decade, and those improvements will benefit AI workloads far more than traditional CPUs.

otabdeveloper4 6 hours ago

> increases relentlessly thanks to the exponential gains of Moore's law

Moore's so-called "law" hasn't been true for years.

Chinese AI labs got ahead of American companies because they put the effort into optimizing their software.

aurareturn 11 hours ago

You just said that everyone will be able to run a powerful AI locally, and then you said this would lead to the bubble popping.

Well, which is it? Is AI going to have such huge demand for chips that it gets much bigger, or is the bubble going to pop? You can't have both.

My opinion is that local LLMs will handle the bulk of low-value inference, such as mundane personal tasks, while cloud AI will be reserved for work and advanced research.
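
One way that split could be wired up (a hypothetical sketch; the routing heuristic, function names, and backend labels are all invented for illustration):

    # Hypothetical dispatcher: cheap local model for mundane prompts,
    # cloud model for work and research. Nothing here is a real API.
    MUNDANE_HINTS = ("remind me", "grocery", "calendar", "timer")

    def is_mundane(prompt: str) -> bool:
        p = prompt.lower()
        return any(hint in p for hint in MUNDANE_HINTS)

    def route_prompt(prompt: str) -> str:
        return "local_llm" if is_mundane(prompt) else "cloud_llm"

    print(route_prompt("remind me to water the plants"))    # -> local_llm
    print(route_prompt("summarize this quarterly report"))  # -> cloud_llm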

  • bcrl an hour ago

    Just because a bubble pops on the economic front doesn't mean the sector goes away. Pets.com went bust a mere 10 months after going public, yet we're buying all kinds of products online in 2025 that we weren't in 2000. A bubble popping is about the disconnect between early adopters' forward-looking assumptions about profitability and the actual returns once the speculation settles down and is replaced by hard data.