Comment by tptacek a day ago

60 replies

> After all, if we lose the joy in our craft, what exactly are we optimizing for?

Solving problems for real people. Isn't the answer here kind of obvious?

Our field has a whole ethos of open-source side projects people do for love and enjoyment. In the same way that you might spend your weekends in a basement woodworking shop without furnishing your entire house by hand, I think the craft of programming will be just fine.

frollogaston a day ago

Same as when higher-level languages replaced assembly for a lot of use cases. And btw, at least in places I've worked, better traditional tooling would replace a lot more headcount than AI would.

  • codr7 a day ago

    Not even close, those were all deterministic, this is probabilistic.

    • tptacek a day ago

      The output of the LLM is probabilistic. The code you actually commit or merge is not.

      • discreteevent a day ago

        The parent is saying that when higher-level languages replaced assembly languages you only had to learn the higher level language. Once you learned the higher level language the machine did precisely what you specified and you did not have to inspect the assembly language to make sure it was compliant. Furthermore you were forced to be precise and to understand what you were doing when you were writing the higher level language.

        Now you don't really have to be precise at any level to get something 'working'. You may not be familiar with the generated language or libraries, but it could look good enough (like the assembly would have looked good enough). So, sure, if you are very familiar with the generated language and libraries and you inspect every line of generated code, then maybe you will be ok. But often the reason you are using an LLM is because e.g. you don't understand or use bash frequently enough to get it to do what you want. Well, the LLM doesn't understand it either. So that weird bash construct that it emitted - did you read the documentation for it? You might have if you had to write it yourself.

        In the end there could be code in there that nothing (machine or human) understands. The less hard-won experience you have with the target and the more time-pressed you are the more likely it is that this will occur.

      • ben-schaaf 16 hours ago

        Exactly. If LLMs were like higher level languages you'd be committing the prompt. LLMs are actually like auto-complete, snippets, stackoverflow and rosetta code. It's not a higher level of abstraction, it's a tool for writing code.

      • rozap a day ago

        i'm just vibing though, maybe i merge, maybe i don't, based on the vibes

      • jll29 a day ago

        Yes.

        The output of the LLM is determined by the weights (parameters of the artificial neural network) estimated during training, as well as by a pseudo-random number generator (unless its influence, called "temperature", is set to 0).

        That means LLMs behave as "processes" rather than algorithms, unlike any code that may be generated from them, which is algorithmic (unless instructed otherwise; you could also tell an LLM to generate an LLM).
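
        A minimal sketch of that sampling step, to make the "temperature" point concrete (illustrative Python, not any particular model's actual decoding code; the function and variable names are invented):

          import numpy as np

          def sample_next_token(logits, temperature, rng=None):
              """Pick the next token id from the model's raw scores (logits)."""
              if rng is None:
                  rng = np.random.default_rng()
              if temperature == 0:
                  # Greedy decoding: always take the highest-scoring token,
                  # so the same prompt yields the same continuation.
                  return int(np.argmax(logits))
              # Otherwise scale the scores and sample from the resulting
              # distribution -- this is where the pseudo-randomness enters.
              scaled = np.asarray(logits, dtype=float) / temperature
              probs = np.exp(scaled - scaled.max())
              probs /= probs.sum()
              return int(rng.choice(len(probs), p=probs))

        With temperature 0 that step is repeatable; with anything higher, two runs on the same prompt can legitimately diverge.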

      • pjmlp 19 hours ago

        The code that the compiler generates, especially in the C realm or with dynamic compilers, is also not fully predictable, hence the tooling constraints in high-integrity computing environments.

    • frollogaston a day ago

      So what? I know most compilers are deterministic, but it really only matters for reproducible builds, not that you're actually going to reason about the output. And the language makes few guarantees about the resulting instructions.

    • eddd-ddde a day ago

      Yet the words you chose to use in this comment were entirely modelled inside your brain in a not so different manner.

  • pjmlp 21 hours ago

    I already see this happening with low code, SaaS and MACH architectures.

    What used to be a project building a CMS backend is now spent doing configuration on a SaaS product and, if we are lucky, a few containers/serverless functions for integrations.

    There are already AI based products that can automate those integrations if given enough data samples.

    Many believe AI will keep using current programming languages as a translation step, just like those Assembly developers thought compiling via Assembly text generation and feeding it into an Assembler would still be around.

    • achierius 17 hours ago

      > just like those Assembly developers thought compiling via Assembly text generation and feeding it into an Assembler would still be around

      Confused by what you mean. Is this not the case?

      • pjmlp 16 hours ago

        No, only primitive UNIX toolchains still do this; most modern compilers generate machine code directly, without generating Assembly text files and running the Assembler on them.

        You can naturally revert to the old ways by asking for the Assembly output manually and calling the Assembler yourself.

JohnFen a day ago

> Solving problems for real people. Isn't the answer here kind of obvious?

No. There are a thousand other ways of solving problems for real people, so that doesn't explain why some choose software development as their preferred method.

Presumably, the reason for choosing software development as the method of solving problems for people is because software development itself brings joy. Different people find joy in different aspects even of that, though.

For my part, the stuff that AI is promising to automate away is much of the stuff that I enjoy about software development. If I don't get to do that, that would turn my career into miserable drudgery.

Perhaps that's the future, though. I hope not, but if it is, then I need to face up to the truth that there is no role for me in the industry anymore. That would pretty much be a life crisis, as I'd have to find and train for something else.

  • simonw a day ago

    "There are a thousand other ways of solving problems for real people, so that doesn't explain why some choose software development as their preferred method."

    Software development is almost unique in the scale that it operates at. I can write code once and have it solve problems for dozens, hundreds, thousands or even millions of people.

    If you want your work to solve problems for large numbers of people I have trouble thinking of any other form of work that's this accessible but allows you to help this many others.

    Fields like civil engineering are a lot harder to break into!

  • JodieBenitez a day ago

    > That would pretty much be a life crisis, as I'd have to find and train for something else.

    There's inertia in the industry. It's not like what you're describing could happen in the blink of an eye. You may well be at the end of your career when this prophecy is fulfilled, if it ever comes true. I sure will be at the end of mine and I'll probably work for at least another 20 years.

    • jll29 a day ago

      The inertia argument is real, and I would compare it to the mistaken belief of some at IBM in the 1970s that SQL would be used by managers to query relational databases directly, so no programming would be needed anymore.

      And what happened? Programmers make the queries and embed them into code that creates dashboards that managers look at. Or managers ask analysts who have to interpret the dashboards for them... It rather created a need for more programmers.

      Compare embedded SQL with prompts - SQL queries are certainly closer to English prose than assembler or FORTRAN code. Did it take some fun away? Perhaps, if manually traversing a network database is fun to anyone, instead of declaratively specifying what set of data to retrieve. But it sure gave new fun to people who wanted to see results faster (let's call them "designers" rather than "coders"), and it made programming more elegant due to the declarativity of SQL queries (although that is cancelled out again by the ugliness of mixing two languages in the code).
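
      For readers who have not seen it, a minimal sketch of what "embedding" a declarative query in code looks like (illustrative Python using sqlite3; the database, table, and column names are invented):

        import sqlite3

        conn = sqlite3.connect("sales.db")  # hypothetical database

        # Declarative: state *what* rows you want, not how to walk the storage.
        query = """
            SELECT region, SUM(amount) AS total
            FROM orders
            WHERE placed_at >= ?
            GROUP BY region
            ORDER BY total DESC
        """
        for region, total in conn.execute(query, ("2024-01-01",)):
            print(region, total)  # feeds a dashboard, report, etc.

      The query-as-a-string is also exactly the "mixing two languages" ugliness mentioned above.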

      Maybe the question is: Does LLM-based coding enable a new kind of higher level "design flow" to replace "coding flow"? (Maybe it will make a slightly different group of people happy?)

      • submain 18 hours ago

        This echoes my sentiment that LLMs are higher-level programming languages. And, as with every layer of abstraction, they add assumptions that may or may not fit the use case. The same way we optimize SQL queries by knowing how the database builds its query plan, we need to optimize LLM outputs, especially when the assumptions made are not ideal.
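
        To make the analogy concrete, a hedged sketch of "knowing how the database makes a query plan" (SQLite syntax via Python; the table and index names are the same invented ones as in the example above):

          import sqlite3

          conn = sqlite3.connect("sales.db")  # hypothetical database

          # Ask the engine how it intends to execute the query...
          plan = conn.execute(
              "EXPLAIN QUERY PLAN "
              "SELECT region, SUM(amount) FROM orders "
              "WHERE placed_at >= ? GROUP BY region",
              ("2024-01-01",),
          ).fetchall()
          print(plan)  # a full table scan here would suggest adding an index

          # ...and, if the plan is poor, nudge the layer below:
          conn.execute(
              "CREATE INDEX IF NOT EXISTS idx_orders_placed_at ON orders(placed_at)"
          )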

  • threatofrain a day ago

    > No. There are a thousand other ways of solving problems for real people, so that doesn't explain why some choose software development as their preferred method.

    I don't see why we should seek an explanation if there are thousands of ways to be useful to people. Is being a lawyer particularly better than being an accountant?

  • fragmede a day ago

    I'm probably just not as smart or creative as you, but say my problem is that I have a ski cabin I want to rent to strangers for money. Never mind a thousand: what are 100 ways, without using software, that I could do something about that, vs. listing it on Airbnb?

    • JohnFen 16 hours ago

      I was speaking about solving people's problems generally. It's easy to find specific problems that are best addressed with software, just as it's easy to find specific problems that can't be addressed with software.

EasyMarion a day ago

solving real problems is the core of it, but for a lot of people the joy and meaning come from how they solve them too. the shift to AI tools might feel like outsourcing the interesting part, even if the outcome is still useful. side projects will stick around for sure, but i think it's fair to ask what the day-to-day feels like when more of it becomes reviewing and prompting rather than building.

ToucanLoucan a day ago

> Solving problems for real people. Isn't the answer here kind of obvious?

Look at the majority of the tech sector for the last ten years or so and tell me this answer again.

Like I guess this is kind of true, if "problems for real people" equals "compensating for inefficiencies in our system for people with money" and "solutions" equals "making a poor person do it for them and paying them as little as legally possible."

  • tptacek a day ago

    Those of us who write software professionally are literally in a field premised on automating other people's jobs away. There is no profession with less claim to the moral high ground of worker rights than ours.

    • simonw a day ago

      I often think about the savage job-destroying nature of the open source community: hundreds of thousands of developers working tirelessly to unemploy as many of their peers as possible by giving away the code they've written for free.

      (Interesting how people talk about AI destroying programming jobs all the time, but rarely mention the impact of billions of dollars of code being given away.)

      • chubot a day ago

        Would vim or python be created by a company? It’s hard to see how they take jobs away

        Open source software is not just different in the license, it’s different in the design

        Linux also doesn’t take jobs away - the majority of contributors are paid by companies, afaik

        • simonw a day ago

          Right: that's the point. Open source has created millions of jobs by increasing the value that individual software developers can provide.

    • JohnFen a day ago

      > Those of us who write software professionally are literally in a field premised on automating other people's jobs away.

      How true that is depends on what sort of software you write. Very little of what I've accomplished in my career can be fairly described as "automating other people's jobs away".

    • concats a day ago

      "Ten year contract you say?"

      "Yes, yes... Satellites stay in orbit for a while. What about it?"

      "Looks a bit cramped in there."

      "Stop complaining, at least it's a real job, now get in, we're about to launch."

    • Verdex a day ago

      Speak for yourself.

      I've worked in a medical space writing software so that people can automate away the job that their bodies used to do before they broke.

      • mopenstein 19 hours ago

        You're automating the 1's and 0's. There could be millions of people in an assembly-line-like row of buttons, being paid minimum wage to press either the 1 or the 0 button to eventually trigger the next operation.

        Now all those jobs are gone because of you.

    • smj-edison a day ago

      Bit of a tangent but...

      Haven't we been automating jobs away since the industrial revolution? I know AI may be an exception to this trend, but at least with classical programming, demand goes up, GDP per capita goes up, and new industries are born.

      I mean, there's three ways to get stuff done: do it yourself, get someone else to do it, or get a machine to do it.

      #2 doesn't scale, since someone still has to do it. If we want every person to not be required to do it (washing, growing food, etc), #3 is the only way forward. Automation and specialization have made the unthinkable possible for an average person. We've a long way to go, but I don't see automation as a fundamentally bad thing, as long as there's a simultaneous effort to help (especially those who are poor) transition to a new form of working.

      • jll29 a day ago

        We have always automated, because we can.

        What is qualitatively different this time is that it affects intellectual abilities - there is nothing higher up in the work "food chain". Replacing physical work, you could always argue you'd have time to focus on making decisions. Replacing decision making might mean telling people to go sit on the beach and collect their universal basic income (UBI) cheque; we don't need them anymore.

        Sitting on the beach is not as nice as it sounds for some; if you don't agree, try doing it for 5 years. Most people require work to have some sense of purpose; it gives identity and structures their time.

        Furthermore, if you replaced lorry drivers with self-driving cars, you'd destroy the most commonly held job in North America as well as South America, and don't tell me that they can be retrained to be AI engineers or social media influencers instead (some can only be on the road, some only want to be on the road).

        • smj-edison 17 hours ago

          I agree that we have been able to automate a lot of jobs, but it's not like intellectual jobs have completely replaced physical labor. Electricians, phlebotomists, linemen, firefighters, caregivers, etc, etc, are jobs that current AI approaches don't even scratch. I mean, Boston dynamics has barely been able to get a robot to walk.

          So no, we don't need to retrain them to be AI engineers if we have an active shortage of electricians and plumbers. Now, perhaps there aren't enough jobs—I haven't looked at exact numbers—but we still have a long ways to go before I think everything is automated.

          Everything being slop seems to be the much more likely issue in my eyes[1].

          [1] https://titotal.substack.com/p/slopworld-2035-the-dangers-of...

      • ToucanLoucan a day ago

        > as long as there's a simultaneous effort to help (especially those who are poor) transition to a new form of working.

        Somehow everyone who says this misses that never in the history of the United States (and most other countries tbh) has this been true.

        We just consign people to the streets in industrial quantity. More underserved to act as the lubricant for capitalism.

    • ToucanLoucan a day ago

      > Those of us who write software professionally are literally in a field premised on automating other people's jobs away.

      Depends what you write. What I work on isn't about eliminating jobs at all, if anything it creates them. And like, actual, good jobs that people would want, not, again, paying someone below the poverty line $5 to deliver an overpriced burrito across town.

      • tptacek a day ago

        I think most of the time when we tell ourselves this, it's cope. Software is automation. "Computers" used to be people! Literally, people.

    • milesrout a day ago

      Automating jobs away is good for workers. Not bad. Don't you start repeating ignorant socialist nonsense. You are better than that.

      • 6747636484 15 hours ago

        > Automating jobs away is good for workers. Not bad.

        Sure, if you completely disregard the past 200 years or so of history.