Comment by yapyap 8 days ago

> Context on Code Quality (via HackerNews): The HackerNews discussion included valid critiques regarding the code quality in this specific Python project example (e.g., logger configuration, custom config parsing, potential race conditions). It’s a fair point, especially given I’m not a Python expert. For this particular green-field project, my primary goal was rapid prototyping and achieving a working solution in an unfamiliar stack, prioritizing the functional outcome over idiomatic code perfection or optimizing for long-term maintainability in this specific instance. It served as an experiment to see how far AI could bridge a knowledge gap. In brown-field projects within my areas of expertise, or projects demanding higher long-term maintainability, the human review, refinement, and testing process (using the guardrails discussed later) is necessarily much more rigorous. The critiques highlight the crucial role of experienced oversight in evaluating and refining AI-generated code to meet specific quality standards.

We all know how big companies handle software: if it works, ship it. Basically, once this shit becomes mainstream, companies will want to shift into their 5x modes (for their oh-so-holy investors who need to see the stock go up, obviously).

So once this sloppy prototype is seen as working, they will just ship the shit-sandwich prototype. And the developers won’t know what the hell it means, so when something breaks in the future (and that is a when, not an if), they will need AI to fix it for them, because once again they do not understand what is going on.

What I’m seeing here is you proposing replacing one of your legs with AI and letting it do all the heavy lifting, just so you can lift heavier things for the moment.

Once this bubble crumbles, the technical debt will be big enough to sink companies. I won’t feel sorry for any of the AI boosties, but I do for their families, who will go into poverty.

namaria 7 days ago

Yup, this is full-on addiction mechanics. You use these generative tools, it feels great, the team and the organization feel great. But it’s not warranted; the underlying thing is inherently flawed.

When the good feeling fades and you need to up the dosage, you will find that your ability to function is declining and your dependency on the generative tools is increasing. Besides, no one is thinking about the end game. If (and it’s a big if) this goes to plan and these generative tools can do everything, well, at that point the only software needed is the generative tool itself, isn’t it? There would be no need for anything else, so anyone building stuff on top of it, or using it to build stuff, would be SOL.

So best case, we all get addicted to fundamentally flawed technology because our ability to function independently has eroded too far; worst case, there will be only foundational-model companies operating in software.