Comment by zdragnar 14 hours ago

What is OpenAI's moat? There are plenty of competitors running their own models and tools. Sure, they have the ChatGPT name, but I don't see them massively out-competing the entire market unless future model improvements drastically exceed the 3->4->5 trajectory.

nojs 13 hours ago

It feels similar to Google to me - what is (was) their moat? Basically slightly better results and strong brand recognition. In later years, maybe privileged data access. But why does nobody use Bing?

  • zdragnar 12 hours ago

    Google got a massive leg up on the rest by having a better service. When Bing first came out, I was not impressed with what I got, and never really bothered going back to it.

    Search quality isn't what it used to be, but the inertia is still paying dividends. That same inertia also applied to Google ads.

    I'm not nearly so convinced OpenAI has the same leg up with ChatGPT. ChatGPT hasn't become a verb quite like google or Kleenex, and it isn't an indispensable part of a product.

    • typpilol 12 hours ago

      I actually find Bing better now for more technical searches.

      Most technical Google searches end up at Windows forums or the official Microsoft support site, which basically just tells you that running sfc /scannow is the fix for everything.

      • 6031769 5 hours ago

        If you are ending up at Windows forums or Microsoft's support site, then the chances are that you were searching for something Microsofty in the first place. And if that's the case, then it's hardly surprising that Microsoft's own search engine is better at surfacing Microsoft-related results than any other.

        Try searching for something technical which isn't MS-specific. That should be a more neutral test.

  • balder1991 12 hours ago

    Google has always been much better than the competition. Even today with their enshittification, competitors still aren’t as good.

    The only thing that has changed that status quo is the rise of audiovisual media and sites closing up so that Google can’t index them, which means web search lost a lot of relevance.

  • HDThoreaun 10 hours ago

    Google's moat is a combination of it being free and being equal to or outright better than competitors.

bobby_mcbrown 13 hours ago

It's Sam.

From what I understand, he was the only one crazy enough to demand hundreds of GPUs for months to get ChatGPT going, which at the time sounded like a terrible bet.

So yeah Sam is the guy with the guts and vision to stay ahead.

  • shermantanktop 13 hours ago

    Past performance is no guarantee of future results.

    You might see Sam as a Midas who can turn anything into gold. But history shows that very few people sustain that pattern.

  • croes 13 hours ago

    OpenAI isn't ahead

bcrl 13 hours ago

This! The cost of training models inevitably goes down over time as FLOPS/$ and PB/$ increase relentlessly thanks to the exponential gains of Moore's law. Eventually we will end up with laptops and phones being Good Enough to run models locally. Once that happens, any competitor in the space that decides to actively support running locally will have operating costs that are a mere fraction of OpenAI's current ones.
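The cost-decline arithmetic behind this argument can be sketched in a few lines. The doubling period and dollar figures below are illustrative assumptions, not numbers from the thread:

```python
# Illustrative sketch: if FLOPS/$ doubles every `doubling_period` years,
# the dollar cost of a *fixed* compute budget halves on the same schedule.
# The 2.5-year doubling period is an assumed rate, not a measured one.
def cost_after(initial_cost: float, years: float, doubling_period: float = 2.5) -> float:
    """Cost of a fixed compute budget after `years` years."""
    return initial_cost / (2 ** (years / doubling_period))

# A hypothetical $100M training run, revisited 10 years later:
print(cost_after(100e6, 10))  # → 6250000.0, i.e. ~$6M for the same compute
```

On this (assumed) trend, compute that only a datacenter can afford today lands in consumer-hardware territory within a decade or two, which is the crux of the local-inference argument.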

The pop of this bubble is going to be painful for a lot of people. Being too early to a market is just as bad as being too late, especially for something that can become a commodity due to a lack of moat.

  • muskyFelon 13 hours ago
    • bcrl 3 hours ago

      The number of transistors per unit area is still increasing; it's just a little slower and more expensive than it used to be.

      And there are innovations that will continue the scaling that Moore's law predicts. Take die stacking as an example. Even Intel had internal studies 20 years ago showing significant performance and power improvements to be had in CPU cores by using two layers of transistors. AMD's X3D CPUs now use technology that stacks extra dies onto a base die, but they're using it in the most basic of ways (only for cache). Going beyond cache to logic, die stacking reduces wire length because more transistors with more layers of metal fit in a smaller space. That in turn improves performance and reduces power consumption.

      The semiconductor industry isn't out of tricks just yet. There are still plenty of improvements coming in the next decade, and those improvements will benefit AI workloads far more than traditional CPUs.

  • otabdeveloper4 8 hours ago

    > increases relentlessly thanks to the exponential gains of Moore's law

    Moore's so-called "law" hasn't been true for years.

    Chinese AI labs beat American companies because they put effort into optimizing the software.

  • aurareturn 13 hours ago

    You just said that everyone will be able to run a powerful AI locally, and then you said this would lead to the bubble popping.

    Well, which is it? Is AI going to have such huge demand for chips that the sector keeps getting bigger, or is the bubble going to pop? You can't have both.

    My opinion is that local LLMs will do the bulk of low-value inference, such as mundane personal-life tasks, while cloud AI will be reserved for work and advanced research purposes.

    • bcrl 3 hours ago

      Just because a bubble pops on the economic front doesn't mean the sector goes away. Pets.com went bust a mere 10 months after going public, yet we're buying all kinds of products online in 2025 that we weren't in 2000. A bubble popping is about the disconnect between the early adopters' forward-looking assumptions about profitability and the actual returns once the speculation settles down and is replaced by hard data.