Comment by overfeed 9 hours ago

You're not doing yourself any favors by labeling people who disagree with you undereducated or uninformed. There are enough over-hyped products/techniques/models/magical-thinking to warrant skepticism. At the root of this thread is an argument (paraphrasing) encouraging people to just wait until someone solves the major problems instead of tackling them themselves. This is a broad statement of faith, if I've ever seen one, in a very religious sense: "Worry not, the researchers and foundation models will provide."

My skepticism, and my intuition that AI innovations are not exponential but sigmoid, are not because I don't understand what gradient descent, transformers, RAG, CoT, or multi-head attention are. My statement of faith is: the ROI economics are going to catch up with the exuberance way before AGI/ASI is achieved; sure, you're getting improving agents for now, but that's not going to justify the 12- or 13-digit USD investments. The music will stop, and improvements will slow to a drip.

Edit: I think at its root, the argument is between folk who think AI will follow the same curve as past technological trends, and those who believe "It's different this time".

bdangubic 8 hours ago

> labeling people who disagree with you undereducated or uninformed

I did neither of these two things... :) I personally could not care less about

- (over)hype

- 12/13/14/15 ... digit USD investment

- exponential vs. sigmoid

There are basically two groups of industry folk:

1. those that see technology as absolutely transformational and are already doing amazeballs shit with it

2. those that argue how it is bad/not-exponential/ROI/...

If I were a professional (I am), I would do everything in my power to learn everything there is to learn (and then more) and join Group #1. But it is easier to be in Group #2, as being in Group #1 requires time and effort and frustrations and throwing your laptop out the window and ... :)

  • gmm1990 8 hours ago

    If there is really amazing stuff happening with this technology, how did we have two recent major outages that were caused by embarrassing problems? I would guess that, at least in the Cloudflare instance, some of the responsible code was AI-generated.

    • ctoth 7 hours ago

      > I would guess that at least in the Cloudflare instance some of the responsible code was AI generated

      Your whole point isn't supported by anything but ... a guess?

      If given the chance to work with an AI who hallucinates sometimes or a human who makes logical leaps like this

      I think I know what I'd pick.

      Seriously, just what even? "I can imagine a scenario where AI was involved, therefore I will treat my imagination as evidence."

      • gmm1990 an hour ago

        The whole point is that the outages happened, not that AI code caused them. If AI is so useful/amazing, then these outages should be less common, not more. It's obviously not rock-solid evidence. Yeah, AI could be useful and speed up or even improve a code base, but there isn't any evidence that it's actually improving anything; the only real studies point to imagined productivity improvements.

      • __loam 4 hours ago

        Microsoft is saying they're generating 30% of their code now and there's clearly been a lot of stability issues with Windows 11 recently that they've publicly acknowledged. It's not hard to tell a story that involves layoffs, increased pressure to ship more code, AI tools, and software quality issues. You can make subtle jabs about your peers as much as you want but that isn't going to change public perception when you ship garbage.

    • bdangubic 5 hours ago

      good thing before “ai” when humans coded we had many decades of no outages… phew

  • wat10000 an hour ago

    I see the first half of group 1, but where's the second half? Don't get me wrong, there's some cool and interesting stuff in this space, but nothing I'd describe as close to "amazeballs shit."

  • overfeed 6 hours ago

    A mutually exclusive group 1 & group 2 are a false dichotomy. One can have a grasp on the field and keep up to date with recent papers, have an active Claude subscription, use agents and still have a net-negative view of "AI" as a whole, considering the false promises, hucksters, charlatans and an impending economic reckoning.

    tl;dr version: having a negative view of the industry is decoupled from one's familiarity with, and usage of, the tools, or the willingness to learn.

    • bdangubic 5 hours ago

      > considering the false promises, hucksters, charlatans and an impending economic reckoning.

      I hack for a living. I could hardly give two hoots about “false promises” or “hucksters” or some “impending economic reckoning…” I made a general comment that a whole lot of people simply discount the technology on technical grounds (a favorite here on HN)…

      • overfeed 4 hours ago

        > I could hardly give two hoots about “false promises” or “hucksters”

        I suppose this is the crux of our misunderstanding: I deeply care about the long-term health and future of the field that gave me a hobby that continues to scratch a mental itch with fractal complexity/details, a career, and more money than I ever imagined.

        > or some “impending economic reckoning…”

        I'm not going to guess if you missed the last couple of economic downturns or rode them out, but an economic reckoning may directly impact your ability to hack for a living, that's the thing you prize.

      • __loam 4 hours ago

        You should give a shit about how the field is perceived, because that affects your ability to make a living whether you care about it or not.

  • __loam 4 hours ago

    Amazeballs shit yet precious little actual products.

juped 6 hours ago

They're not logistic, this is a species of nonsense claim that irks me even more than claiming "capabilities gains are exponential, singularity 2026!"; it actually includes the exponential-gains claim and then tries to tack on epicycles to preempt the lack of singularities.

Remember, a logistic curve is an exponential (so, roughly, a process whose outputs feed its growth, the classic example being population growth, where more population makes more population) with a carrying capacity (the classic example is again population, where you need to eat to be able to reproduce).
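The point above can be sketched numerically (a minimal illustration with made-up parameters, not a model of anything real): a logistic curve with growth rate r and carrying capacity k tracks a pure exponential almost exactly at first, then flattens toward k.

```python
import math

def exponential(t, p0=1.0, r=0.5):
    # Pure exponential growth: output feeds its own growth, no cap.
    return p0 * math.exp(r * t)

def logistic(t, p0=1.0, r=0.5, k=1000.0):
    # Same exponential engine, but with a carrying capacity k.
    return k / (1.0 + ((k - p0) / p0) * math.exp(-r * t))

# Early on, the two curves are nearly indistinguishable...
for t in (0, 2, 4):
    print(t, exponential(t), logistic(t))

# ...but far from the origin the logistic saturates near k,
# while the exponential keeps compounding without limit.
print(20, exponential(20), logistic(20))
```

Which is why "it's sigmoid" quietly contains "it's exponential (for now)" plus a claim about where the ceiling is.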

Singularity 2026 is open and honest, wearing its heart on its sleeve. It's a much more respectable wrong position.