Comment by exitb a day ago

Isn't this a bit revisionist? I started to become interested in programming around the late 90s, and I don't remember anyone floating the idea that OOP, libraries, or IDEs would make programming obsolete as a profession. If anything, pre-2023, most programmers considered their job one of the hardest to automate.

coldpie a day ago

> I started to become interested in programming around the late 90s, and I don't remember anyone floating the idea that OOP, libraries, or IDEs would make programming obsolete as a profession.

The version of this hype that I remember from circa 2004 was that UML[1] was going to make most programming automated. You'd have an architect who would draw out your problem's architecture in a GUI[2], press a button to generate all the code for that architecture, and have a programmer fill in a couple dozen lines of business logic. Boom, program done by two or three people in a couple of weeks, let's all go home. It, uh, didn't work out that way.
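To make the "press a button, generate the code" workflow concrete, here's a toy sketch in Python (not any actual Rational product; the class spec and function names here are invented for illustration) of what model-driven code generation amounted to: the tool emits class scaffolding from a model, and a programmer fills in the stubbed business logic.

```python
# Toy model-driven code generation, in the spirit of early-2000s UML tools.
# (Illustrative only -- real tools like Rational Rose emitted far more ceremony.)

def generate_class(name, attributes, operations):
    """Emit a class skeleton from a minimal 'UML' spec.

    attributes: list of (attr_name, type_hint) pairs
    operations: list of method names; bodies are left as stubs for a
    programmer to fill in -- the "couple dozen lines of business logic".
    """
    lines = [f"class {name}:"]
    params = ", ".join(f"{a}: {t}" for a, t in attributes)
    lines.append(f"    def __init__(self, {params}):")
    for a, _ in attributes:
        lines.append(f"        self.{a} = {a}")
    for op in operations:
        lines.append(f"    def {op}(self):")
        lines.append("        raise NotImplementedError('business logic here')")
    return "\n".join(lines)

# "Architect draws the model"; the tool turns it into a skeleton.
skeleton = generate_class(
    "Invoice",
    attributes=[("customer", "str"), ("total", "float")],
    operations=["validate", "post_to_ledger"],
)
print(skeleton)
```

The catch, of course, was that the stubbed bodies were where all the actual difficulty lived.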

You can read a lot more about all this by following the various links to concepts and products from Rational's Wikipedia page, https://en.wikipedia.org/wiki/Rational_Software (the Rational Unified Process page in particular brings back some memories). It wasn't ill-intentioned, but it was a phase the industry went through that ultimately didn't work out.

[1] https://en.wikipedia.org/wiki/UML

[2] https://en.wikipedia.org/wiki/File:Component-based-Software-...

  • HPsquared 18 hours ago

    It's interesting that even with the rise of transformers, UML still isn't popular. I wonder if that, or some other visual way of representing specs, might make a comeback.

rokob a day ago

There was definitely a widely held belief in the late 90s and early 00s that programming had been commoditized to the point that it would be fully offshored to wherever labor was cheapest. This happened in some areas and failed in others, and it still happens now and then. But I remember hearing some of that argument based on OO and libraries making it so unskilled people could just put together Legos.

  • Al-Khwarizmi a day ago

I remember that. I studied CS in that period, and some professors were convinced that software development was going to become an unskilled job, analogous to bricklaying, and that our goal as future CS graduates should be to become managers, just as someone who studies a degree in building construction is meant to become an architect, not a bricklayer.

    I never believed it, though (if I had, I would probably have switched degrees, as I hate management). And while the belief was common, my impression is that it was only so among people who didn't code much. The details on how it would happen were always highly handwavy and people defending that view had a tendency to ignore any software beyond standard CRUD apps.

    In contrast, if I had to choose a degree right now, I'd probably avoid CS (or at most study it out of passion, like one could study English philology or something, but without much hope of it being a safe choice for my career). I think the prospects for programmers in the LLM era look much scarier, and the threats look much more real, than they ever did in that period.

    • ghaff 21 hours ago

The bigger issue is that so many people have jumped into CS because programming (not the same thing, I know) has come to be seen as this thing that will earn you big bucks.

Of course, some level of computer skills is important in most professions at this point. But logic suggests that CS (and programming) compensation will settle at a level comparable to similarly skilled technical professions.

  • Cthulhu_ a day ago

It's a bit of an overgeneralization to say that it failed and happens "now and then"; offshoring is a multi-billion-dollar industry employing millions of people.

    And the "unskilled people putting together legos" is also very much a thing in the form of low/no-code platforms, from my own circles there's Mendix and Tibco, arguably SAP, and probably a heap more. Arguably (my favorite word atm) it's also still true in most software development because outside of coding business logic, most heavy lifting is done by the language's SDK and 3rd party libraries.

tclancy a day ago

>I don't remember anyone floating the idea that OOP, libraries or IDEs

Oh man, you had it lucky. Object databases were going to replace SQL multiple times, XML was going to eat the world, and I vividly remember a UX person taking one look at Ruby on Rails at maybe 1.0 and declaring he would not be needing us programmers anymore.

  • paganel 19 hours ago

    > Object databases were going to replace SQL multiple times

Someone should honestly write a history of Zope. Can't say I hated it, because working with it was my first paid job as a programmer, but it was special, very special, let's call it that. Twenty years on, I do think it had its good moments, even bright ideas in places; too bad they didn't scale.

munificent 19 hours ago

This had definitely been a thing since at least the 80s, though everyone always seemed to think it would be the next generation of languages that did it. For example, in 1981, James Martin wrote a book called "Application Development Without Programmers".

QuercusMax 19 hours ago

I remember reading 25-30 years ago about how 4GLs and object libraries were going to democratize software creation. Don't recall it being sold as an apocalypse for coders, though.

absqueued a day ago

I find the idea of IntelliJ as a job killer hard to believe, just like when some of my colleagues used to think Dreamweaver would wipe out frontend development - or "HTML slicing", as we called it back then.

  • QuercusMax 19 hours ago

    Yeah, that's weird; IntelliJ was more like "this is how amazing and friction-free Java development and refactoring can be". Enabling more ppl to be 10x programmers, not putting ppl out of work.

stavros a day ago

Yeah, I can confirm, before LLMs I definitely thought coding would be the last thing to go.

  • mistersquid a day ago

    > before LLMs I definitely thought coding would be the last thing to go.

While LLMs still struggle to produce high-quality code, depending on prompt quality and available training data, many human software developers are surprised that LLMs (software) can generate quality software at all.

    I wonder to what extent this surprise is because people tend to think very deeply when writing software and assume thinking and "reasoning" are what produce quality software. What if the experience of "thinking" and "reasoning" are epiphenomena of the physical statistical models present in the connections of our brains?

This is an ancient and unsolved philosophical problem (i.e., the problem of dualism): whether consciousness and free will affect the physical world. If we live in a materialist universe where matter and the laws of physics are unaffected by consciousness, then "thinking", "reasoning", and "free will" are purely subjective. In such a view, subjective experience attends material changes in the world but does not affect the material world.

    Software developers surprised by the capabilities of software (LLMs) to write software might not be so surprised if they understood consciousness as an epiphenomenon of materiality. Just as words do not cause diaphragms to compress lungs to move air past vocal cords and propagate air vibrations, perhaps the thoughts that attend action (including the production of words) are not the motive force of those actions.

    • autoexec 20 hours ago

      > I wonder to what extent this surprise is because people tend to think very deeply when writing software and assume thinking and "reasoning" are what produce quality software.

It takes deep thought and reasoning to produce good code. LLMs don't think or reason. They don't have to, though, because humans have done all of that for them. They just have to regurgitate what humans have already done. Everything good an LLM outputs came from the minds of humans who did all the real work. Sometimes they can assemble bits of human-generated code in ways that do something useful, just like someone copying and pasting code out of Stack Exchange without understanding any of it can sometimes slap something together that does something useful.

      LLMs are a neat party trick, and it can be surprising to see what they do and fun to see where they fail, but it all says very little about what it means to think and reason or even what it means to write software.

  • stickfigure a day ago

    I'm still of the opinion that coding will be the last thing to go. LLMs are an enabler, sure, but until they integrate some form of neuroplasticity they're stuck working on Memento-guy-sized chunks of code. They need a human programmer to provide long context.

    Maybe some new technique will change that, but it's not guaranteed. At this point I think we can safely surmise that scaling isn't the answer.

    • forgetfulness 20 hours ago

      I’m more inclined to believe that no jobs (as in trades, professions) will go, but programming will be the most automated, along with design and illustration.

      Why? To this day still they’re the showcase of what LLMs “can” do for (to) a line of work, but they’re the only ones with all the relevant information online.

      For programming, there’s decades of textbooks, online docs, bug tracker tickets, source code repositories, troubleshooting on forums, all laying out how a profession is exercised from start to finish.

      There’s hardly a fraction of this to automate the tasks of the average Joe who does some paperwork the model has never seen, who’s applying some rough procedures we would call “heuristics” to some spreadsheets and emails, and has to escalate to his supervisor for things out of code several times a day.

  • jollyllama 18 hours ago

Beyond training-data availability, it's always easiest to automate what you understand. Since building AI/LLMs is itself an exercise in software engineering, that's the discipline that has been automated to the extent that it has. Everything else requires more domain knowledge.

  • zoeysmithe 21 hours ago

I'm not sure what happens when you replace coders with "prompt generalists" and the output has non-trivial bugs. What do you do then? The product is crashing and the business is losing money? Or there's a security bug? You can't just tell LLMs, "oh wait, what you made is bad, make it better." At a certain point, that's the best it can make. And if you don't understand the security or engineering issue behind the bug, even if the LLM can fix it, you don't have the skills to prompt it correctly to do so.

I see tech as "the king's guard" of capitalism. They'll be the last to go, because at the end of the day they need to be able to serve the king. "Prompt generalists" are like replacing the king's guard with a bunch of pampered royals who "once visited a battlefield." It's just not going to work when someone comes at the king.

    • autoexec 20 hours ago

> You can't just tell LLMs, "oh wait, what you made is bad, make it better." At a certain point, that's the best it can make. And if you don't understand the security or engineering issue behind the bug, even if the LLM can fix it, you don't have the skills to prompt it correctly to do so.

      In that case, the idea is that you'd see most programmers in the company replaced by a much smaller group of prompt generalists who work for peanuts, while the company keeps on a handful of people who actually know how to program and do nothing all day long but debug AI written code.

When things crash or a security issue comes up, they bring in the team of programmers; but since they only need a small number of them to get the AI code working again, most programmers would be out of a job. Large numbers of people who actually like touching code for a living would compete for the very small number of jobs available, driving down wages.

      In the long term, this would be bad because a lot of talented coders won't be satisfied being QA to AI slop and will move on to other passions. Everything AI knows it learned from people who had the skill to do great things, but once all the programmers are just debugging garbage AI code there will be fewer programmers doing clever things and posting their code for AI to scrape and regurgitate. Tech will stagnate since AI can't come up with anything new and will only have its own slop to learn from.

Personally, I doubt it'll happen that way. I'm skeptical that LLMs will become good enough to be a real threat. Eventually the AI bubble will burst as companies realize that chatbots aren't ever going to be AGI and will never get good enough to replace most of their employees; once they see that they're still stuck paying the peasant class, things will slowly get back to normal.

In the meantime, expect random layoffs and rehires (at lower wages) as companies try and fail to replace their pesky human workers with AI, and expect AI to be increasingly shoehorned into places it has no business being, screwing things up and making your life harder in new and frustrating ways.