Comment by blablabla123 4 hours ago

19 replies

Despite the flashy title, that's the first "sober" analysis of the technology from a CEO that I've read. While not even really news, it's also worth mentioning that the energy requirements are impossible to fulfill

Also, I've been using ChatGPT intensively for months now for all kinds of tasks, and I've tried Claude etc. None of it is on par with a human. The code snippets are straight out of Stack Overflow...

delaminator 4 hours ago

Your assessment of Claude simply isn’t true.

Or Stack Overflow is really good.

I’m producing multiple projects per week, each of which would otherwise be weeks of work.

  • bloppe 3 hours ago

    Would you mind sharing some of these projects?

    I've found Claude's usefulness is highly variable, though somewhat predictable. It can write `jq` filters flawlessly every time, whereas I would normally spend 30 minutes scanning docs because nobody memorizes `jq` syntax. And it can comb through the server logs in every pod of my k8s clusters extremely fast. But it often struggles to make quality code changes in a large codebase, or to write good documentation that isn't just an English translation of the code it's documenting.
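    To illustrate the kind of one-liner involved (the filter and pod JSON below are invented for illustration; they are not from this thread), here is a hypothetical `jq` filter over a kubectl-style pod listing, sketched with its Python equivalent:

```python
import json

# Hypothetical kubectl-style pod listing (invented for illustration).
doc = json.loads("""
{"items": [
  {"metadata": {"name": "api-7f9c"}, "status": {"phase": "Running"}},
  {"metadata": {"name": "worker-1"}, "status": {"phase": "Pending"}}
]}
""")

# Python equivalent of the jq filter:
#   .items[] | select(.status.phase == "Running") | .metadata.name
running = [pod["metadata"]["name"]
           for pod in doc["items"]
           if pod["status"]["phase"] == "Running"]
print(running)  # ['api-7f9c']
```

    This is exactly the sort of throwaway transformation where an LLM saves the half-hour of doc-scanning mentioned above.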

    • blazingbanana 17 minutes ago

      Not the OP you're replying to, but I've put together quite a few projects using only LLMs, with no hand-crafted code anywhere (I couldn't write it myself!)

      https://dnbfamily.com

      https://eventme.app

      https://blazingbanana.com

      https://play.google.com/store/apps/details?id=com.blazingban...

      https://play.google.com/store/apps/details?id=com.blazingban...

      https://play.google.com/store/apps/details?id=com.blazingban...

      Are they perfect? No, probably not, but I wouldn't have been able to make any of these without LLMs. The last app was originally built with GPT-3.5.

      There is a whole host of other non-public projects I've built with LLMs; these are just a few of the public ones.

    • gloosx 2 hours ago

      It is always "I'm producing 300 projects in a nanosecond", but it's almost never followed by actually sharing or deploying them ;)

      • freehorse an hour ago

        At this point my prior is that all these 300/ns projects are some kind of internal tools, with very narrow scope, many just for one-off use.

        Which is also fine and great and very useful, and I'm making those too, but it probably does not generalize to projects that require higher quality standards and actual maintenance.

      • DoctorOW an hour ago

        The problem I had was that the larger your project gets, the more mistakes Claude makes. I (not the parent commenter) started with a basic CRUD web app and was blown away by how detailed it was: new CSS, good error handling, good selection and use of libraries; it could even write the terminal commands for package management and building. As the project grew larger, Claude started forgetting that some code already existed in the project and started repeating itself, and worse still, when I asked for new features it would pick one of the copies at random, leaving them out of sync with each other. Moving forward, I've been alternating between writing stuff with AI and then rewriting it myself.

    • wartywhoa23 an hour ago

      They really should have been supplying at least a week's worth of ready-made "projects" to every freelance AI promoter out there, to demonstrate the x9000 AI productivity gains to the skeptics.

      Because vibing about those gains without any evidence looks too shilly.

    • steve_adams_86 3 hours ago

      Claude has taught me so much about how to use jq better, and really, far more efficient ways of using the command line in general. It's great. Ironically, the more I learn, the less I want to ask it to do things.

  • written-beyond 4 hours ago

    I'm just as much of an avid fan of LLM code generation as you may be, but I do wonder about the practicality of spending time making projects anymore.

    Why build them if others can just generate them too? Where is the value in making so many projects?

    If the value is in who can sell it best to people who can't generate it, isn't it just a matter of time before someone else generates one and becomes better than you at selling it?

    • jstummbillig 4 hours ago

      The value is that we need a lot more software, and now, because building software has gotten so much less time-consuming, you can sell software to people who could not or would not have paid for it previously at a different price point.

      • eschaton 3 hours ago

        We don’t need more software, we need the right software implemented better. That’s not something LLMs can possibly give us because they’re fucking pachinko machines.

        Here’s a hint: Nobody should ever write a CRUD app, because nobody should ever have to write a CRUD app; that’s something that can be generated fully and deterministically (i.e. by a set of locally-executable heuristics, not a goddamn ocean-boiling LLM) from a sufficiently detailed model of the data involved.
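        As a sketch of that claim (the `users` table and its field names are invented for illustration), CRUD operations really can be generated mechanically from nothing more than a schema, no LLM required:

```python
# Deterministic CRUD generation from a schema: no LLM involved.
# The table name, fields, and in-memory store are hypothetical illustrations.

def make_crud(table, fields):
    """Generate create/read/update/delete functions for a table
    described only by its field names."""
    store, next_id = {}, [1]

    def create(**kwargs):
        row = {f: kwargs.get(f) for f in fields}
        row["id"] = next_id[0]
        store[next_id[0]] = row
        next_id[0] += 1
        return row

    def read(row_id):
        return store.get(row_id)

    def update(row_id, **kwargs):
        row = store[row_id]
        row.update({f: v for f, v in kwargs.items() if f in fields})
        return row

    def delete(row_id):
        return store.pop(row_id, None)

    return {"create": create, "read": read,
            "update": update, "delete": delete}

users = make_crud("users", ["name", "email"])
u = users["create"](name="Ada", email="ada@example.com")
print(users["read"](u["id"])["name"])  # Ada
```

        The same schema-driven idea is what the forms libraries mentioned below (and modern scaffolding generators) rely on: every handler is a pure function of the data model.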

        In the 1970s you could wire up an OS-level forms library to your database schema and then serve literally thousands of users from a system less powerful than the CPU in a modern peripheral or storage controller. And in less RAM, too.

        People need to take a look at what was done before in order to truly have a proper degree of shame about how things are being done now.

  • blablabla123 4 hours ago

    Sure, but these are likely just variations of existing things. And yet the quality is still behind the original.

  • eschaton 3 hours ago

    I produce a lot of shit every week too, but I don’t brag about my digestive system on “Hacker” “News.”

will4274 3 hours ago

> While not even really news, it's also worth mentioning that the energy requirements are impossible to fulfill

If you believe this, you must also believe that global warming is unstoppable. OpenAI's energy costs are large compared to the current electricity market, but not so large compared to the current energy market. Environmentalists usually suggest that the solution to global warming is electrification (converting non-electrical energy use to electrical energy) and then making that electrical energy clean. OpenAI's energy needs are something like 10% of the current worldwide electricity market but less than 1% of the current worldwide energy market.

  • blablabla123 16 minutes ago

    Google recently announced that it will double its AI data center capacity every 6 months. While both unfortunately involve exponential growth, we are talking about roughly 1% CO2 growth, which is bad enough, versus effectively 300% per year according to Google.
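    The 300% figure follows directly from the stated doubling rate: doubling every 6 months means two doublings per year, i.e. a factor of 4, i.e. 300% year-over-year growth. A quick check:

```python
# Doubling every 6 months -> 12/6 = 2 doublings per year -> x4 capacity,
# i.e. 300% year-over-year growth.
doublings_per_year = 12 / 6
growth = 2 ** doublings_per_year - 1
print(f"{growth:.0%}")  # 300%
```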

  • rvnx 3 hours ago

    Imagine how big a pile of trash we'll get as the current generation of graphics cards used for LLM training becomes outdated. It will crash the hardware market (which is good news for gamers).