jihadjihad 3 days ago

> Garman is also not keen on another idea about AI – measuring its value by what percentage of code it contributes at an organization.

You really want to believe, maybe even need to believe, that anyone who comes up with this idea in their head has never written a single line of code in their life.

It is on its face absurd. And yet I don't doubt for a second that Garman et al. have to fend off legions of hacks who froth at the mouth over this kind of thing.

  • Buttons840 3 days ago

    Time to apply the best analogy I've ever heard.

    > "Measuring software productivity by lines of code is like measuring progress on an airplane by how much it weighs." -- Bill Gates

    Do we reward the employee who has added the most weight? Do we celebrate when the AI has added a lot of weight?

    At first, it seems like, no, we shouldn't, but actually, it depends. If a person or AI is adding a lot of weight, but it is really important weight, like the engines or the main structure of the plane, then yeah, even though it adds a lot of weight, it's still doing genuinely impressive work. A heavy airplane is more impressive than a lightweight one (usually).

    • subhro 3 days ago

      I just can’t resist when airplanes come up in discussion.

      I completely understand your analogy and you are right. However, just to nitpick, it is actually super important to have the weight on the airplane in the right place. You have to make sure that your aeroplane does not become tail heavy, or it will not be recoverable from a stall. Also, a heavier aeroplane, within its gross weight, is actually safer, as the safe manoeuvring speed increases with weight.

      • fny 3 days ago

        I think this makes the analogy even more apt.

        If someone adds more code to the wrong places for the sake of adding more code, the software may not be recoverable for future changes or from bugs. You also often need to add code in the right places for robustness.

      • dahart 3 days ago

        > a heavier aeroplane … is actually safer

        Just to nitpick your nitpick, that’s only true up to a point, and the range of safe weights isn’t all that big, really - max payload on most planes is a fraction of the empty weight. And planes can be overweight; reducing weight is a good thing, and perhaps needed far more often than adding weight is. The point of the analogy was that over a certain weight, the plane doesn’t fly at all. If progress on a plane is safety, stability, or speed, we can measure those things directly. If weight distribution is important to those, that's great: we can measure weight and distribution in service of stability, but weight isn't the primary thing we use.

        Like with airplane weight, you absolutely need some code to get something done, and sometimes more is better. But is more better as a rule? Absolutely not.

      • RugnirViking 3 days ago

        Right, that's why it's a great analogy - because you also need at least some code in a successful piece of software. But simply measuring by the amount of code leads to weird and perverse incentives - code added without thought is not good, and too much code can itself be a problem. Of course, the literal balancing aspect isn't as important.

      • scarier 3 days ago

        This is a pretty narrow take on aviation safety. A heavier airplane has a higher stall speed, more energy for the brakes to dissipate, longer takeoff/landing distances, a worse climb rate… I’ll happily sacrifice maneuvering speed for better takeoff/landing/climb performance.

      • vdqtp3 3 days ago

        > the safe manoeuverable speed increases with weight

        The reason this is true is because at a higher weight, you'll stall at max deflection before you can put enough stress on the airframe to be a problem. That is to say, at a given speed a heavier airplane will fall out of the air [hyperbole, it will merely stall - significantly reduced lift] before it can rip the wings/elevator off [hyperbole - damage the airframe]. That makes it questionable whether heavier is safer - just changes the failure mode.

    • fbd_0100 3 days ago

      Progress on airplanes is often tracked by the number of engineering drawings released, which means that 1000s of little clips, brackets, fittings, etc. can sometimes misrepresent the amount of engineering work that has taken place compared to preparing a giant monolithic bulkhead or spar for release. I have actually proposed measuring progress by part weight instead of part count to my PMs for this reason.

    • stockresearcher 3 days ago

      > the best analogy I've ever heard.

      It’s an analogy that gets the job done and is targeted at non-tech managers.

      It’s not perfect. Dead code has no “weight” unless you’re in a heavily storage-constrained environment. But 10,000 unnecessary rivets has an effect on the airplane everywhere, all the time.

      • runako 3 days ago

        > Dead code has no “weight”

        Assuming it is truly dead and not executable (which someone would have to verify is & remains the case), dead code exerts a pressure on every human engineer who has to read (around) it, determine that it is still dead, etc. It also creates risk that it will be inadvertently activated and create e.g. security exposure.

      • wat10000 3 days ago

        In this analogy, I'd say dead code corresponds to airplane parts that aren't actually installed on the aircraft. When people talk about the folly of measuring productivity in lines of code, they aren't referring to the uselessness of dead code, they're referring to the harms that come from live code that's way bigger than it needs to be.

      • 1718627440 3 days ago

        When you are thinking of development and refactoring, dead code absolutely has weight.

  • shit_game 3 days ago

    This reminds me of a piece on folklore.org by Andy Hertzfeld[0], regarding Bill Atkinson. A "KPI" was introduced at Apple in which engineers were required to report how many lines of code they had written over the week. Bill (allegedly) claimed "-2000" (a completely, astonishingly negative report), and supposedly the managers reconsidered the validity of the "KPI" and stopped using it.

    I don't know how true this is in fact, but I do know how true this is in my work - you cannot apply some arbitrary "make the number bigger" goal to everything and expect it to improve anything. It feels a bit weird seeing "write more lines of code" becoming a key metric again. It never worked, and is damn-near provably never going to work. The value of source code is not in any way tied to its quantity, but value still proves hard to quantify, 40 years later.

    0. https://www.folklore.org/Negative_2000_Lines_Of_Code.html

    • wat10000 3 days ago

      Goodhart's law: when a measure becomes a target, it ceases to be a good measure.

  • rhplus 3 days ago

    Given the way that a lot of AI coding actually works, it’s like asking what percent of code was written by hitting tab to autocomplete (intellisense) or what percent of a document benefited from spellcheck.

    • SV_BubbleTime 3 days ago

      While most of us know the next word guessing is how it works in reality…

      That sentiment ignores the magic of how well this works. There are mind blowing moments using AI coding, to pretend that it’s “just auto correct and tab complete” is just as deceiving as “you can vibe code complete programs”.

  • exasperaited 3 days ago

    All that said, I'm very keen on companies telling me how much of their codebase was written by AI.

    I just won't use that information in quite the excitable, optimistic way they offer it.

    • NoMoreNicksLeft 3 days ago

      I want to have the model re-write patent applications, and if any portion of your patent filing was replicated by it your patent is denied as obvious and derivative.

    • jihadjihad 3 days ago

      "...just raised a $20M Series B and are looking to expand the team and products offered. We are fully bought-in to generative AI — over 40% of our codebase is built and maintained by AI, and we expect this number to continue to grow as the tech evolves and the space matures."

      "What does your availability over the next couple of weeks look like to chat about this opportunity?"

      • jraph 3 days ago

        "Yeah, quite busy over the next couple of weeks actually… the next couple of decades, really - awful how quickly time fills by itself these days, right? I'd have contributed towards lowering that 40% number which seems contrary to your goals anyway. But here's my card, should you need help with debugging something tricky some time in the near future and nobody manages to figure it out internally. I may be able to make room for you if you can afford it. I might be VERY busy though."

  • techpineapple 3 days ago

    Something I wonder about the percent of code - I remember like 5-10 years ago there was a series of articles about Google generating a lot of their code programmatically, I wonder if they just adapted their code gen to AI.

    I bet Google has a lot of tools to, say, convert a library from one language to another or generate a library based on an API spec. The 30% of code these LLMs are supposedly writing is probably in this camp, not net-new novel features.

  • ks2048 3 days ago

    When I see these stats, I think of all the ways "percentage of code" could be defined.

    I ask an AI 4 times to write a method for me. After it keeps failing, I just write it myself. AI wrote 80% of the code!
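To make the definition-sensitivity concrete, here is a small Python sketch (the line counts are invented for illustration): the very same session reads as 80% or 0% "AI-written" depending on whether you count everything the model ever emitted or only what survived into the final code.

```python
# Hypothetical session: 4 discarded AI attempts (~50 lines each),
# then 50 lines written by hand. How much code did the AI write?

def pct(part: int, whole: int) -> int:
    """Percentage, rounded to the nearest integer."""
    return round(100 * part / whole)

ai_lines_generated = 4 * 50  # everything the model ever emitted
ai_lines_shipped = 0         # none of it survived review
human_lines_shipped = 50     # the method written by hand

# Definition A: share of all lines ever produced in the session.
by_generated = pct(ai_lines_generated, ai_lines_generated + human_lines_shipped)
# Definition B: share of lines actually present in the final code.
by_shipped = pct(ai_lines_shipped, ai_lines_shipped + human_lines_shipped)

# by_generated == 80, by_shipped == 0
```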

  • ozgrakkurt 2 days ago

    It is a really attractive idea for lazy people who don’t want to learn things

  • DanielHB 3 days ago

    It is like measuring company output based on stuff done through codegen...

elashri 3 days ago

In academia the research pipeline is this

Undergraduate -> Graduate Student -> Post-doc -> Tenure/Senior

Some exceptions occur: people getting tenure without a post-doc, or finishing an undergraduate degree in one or two years. But no one expects that we can skip the first two stages wholesale and still end up with senior researchers.

The same idea applies anywhere. The rule is that if you don't have juniors, you don't get seniors - so you'd better prepare your bot to do everything.

tananaev 3 days ago

As always, the truth is somewhere in the middle. AI is not going to replace everyone tomorrow, but I also don't think we can ignore productivity improvements from AI. It's not going to replace engineers completely now or in the near future, but AI will probably reduce the number of engineers needed to solve a problem.

cwoolfe 3 days ago

I'm a technical co-founder rapidly building a software product. I've been coding since 2006. We have every incentive to have AI just build our product. But it can't. I keep trying to get it to...but it can't. Oh, it tries, but the code it writes is often overly complex and overly-verbose. I started out being amazed at the way it could solve problems, but that's because I gave it small, bounded, well-defined problems. But as expectations with agentic coding rose, I gave it more abstract problems and it quickly hit the ceiling. As was said, the engineering task is identifying the problem and decomposing it. I'd love to hear from someone who's used agentic coding with more success. So far I've tried Co-pilot, Windsurf, and Alex sidebar for Xcode projects. The most success I have is via a direct question with details to Gemini in the browser, usually a variant of "write a function to do X"

  • svara 3 days ago

    > As was said, the engineering task is identifying the problem and decomposing it.

    In my experience if you do this and break the problem down into small pieces, the AI can implement the pieces for you.

    It can save a lot of time typing and googling for docs.

    That said, once the result exceeds a certain level of complexity, you can't really ask it to implement changes to existing code anymore, since it stops understanding it.

    At which point you now have to do it yourself, but you know the codebase less well than if you'd hand written it.

    So, my upshot is so far that it works great for small projects and for prototyping, but the gain after a certain level of complexity is probably quite small.

    But then, I've also found quite some value in using it as a code search engine and to answer questions about the code, so maybe if nothing else that's where the benefit comes from.

    • mavilia 3 days ago

      > At which point you now have to do it yourself, but you know the codebase less well than if you'd hand written it.

      Appreciate you saying this because it is my biggest gripe in these conversations. Even if it makes me faster I now have to put time into reading the code multiple times because I have to internalize it.

      Since the code I merge into production "is still my responsibility" as the HN comments go, then I need to really read and think more deeply about what AI wrote as opposed to reading a teammate's PR code. In my case that is slower than the 20% speedup I get by applying AI to problems.

      I'm sure I can get even more speed if I improve prompts, when I use the AI, agentic vs non-agentic, etc. but I just don't think the ceiling is high enough yet. Plus I am someone who seems more prone to AI making me lazier than others so I just need to schedule when I use it and make that time as minimal as possible.

scotty79 3 days ago

Are we trying to guilt trip corporations to do socially responsible thing regarding young workers skill acquisition?

Haven't we learned that it almost always ends up in hollow PR and marketing theater?

Basically the solution to this is extending education so that people entering workforce are already at senior level. Of course this can't be financed by the students, because their careers get shortened by longer education. So we need higher taxes on the entities that reap the new spoils. Namely those corporations that now can pass on hiring junior employees.

fabioyy 2 days ago

I do not agree. It was not even worth it without LLMs. Juniors will always take a LOT of time from seniors, and when a junior becomes good enough, he will find another job, and the senior will be stuck in this loop.

Junior + LLM is even worse: they become prompt engineers.

EternalFury 3 days ago

I was going to say something, then I realized my cynicism is already at maximum.

subhashp 2 days ago

Makes sense. Instead of replacing junior staff, they should be trained to use AI to get more done in less time. In the next 2-3 years they will be experts doing good work with high productivity.

lenerdenator 3 days ago

Junior staff will be necessary but you'll have to defend them from the bean-counters.

You need people who can validate LLM-generated code. It takes people with testing and architecture expertise to do so. You only get those things by having humans get expertise through experience.

lbrito 3 days ago

>teach “how do you think and how do you decompose problems”

That's rich coming from AWS!

I think he meant "how do you think about adding unnecessary complexity to problems such that it can enable the maximum amount of meetings, design docs and promo packages for years to come"!

VagabundoP 2 days ago

Two things will hurt us in the long run: working from home and AI. I'm generally in favour of both, but they hurt newbies, who are not spending enough face-to-face time with seniors to learn on the job.

And AI will hurt them in their own development and with it taking over the tasks they would normally cut their teeth on.

We'll have to find newer ways of helping the younger generation get in the door.

  • xz0r 2 days ago

    A weekly one-hour call for pair programming or exploring an ongoing issue or technical idea has been enough to replace face-to-face time with seniors. This has been working great for us at a multi-billion-dollar, profitable public company that's been fully remote.

  • kisamoto 2 days ago

    I would argue that just being in the office, or not using AI, doesn't guarantee any better learning for younger generations. Without proper guidance a junior would still struggle regardless of their location or AI use.

    The challenge now is for companies, managers and mentors to adapt to more remote and AI assisted learning. If a junior can be taught that it's okay to reach out (and be given ample opportunities to do so), as well as how to productively use AI to explain concepts that they may feel too scared to ask because they're "basics", then I don't see why this would hurt in the long run.

Martin_Silenus 3 days ago

I can't wait for that damn bubble to explode, really...

This is becoming unbreathable for hackers.

  • LinuxAmbulance 3 days ago

    It's already exploding.

    The hype train is going to keep on moving for a while yet though.

dcchambers 3 days ago

A lot of companies that have stopped hiring junior employees are going to be really hurting in a couple of years, once all of their seniors have left and they have no replacements trained and ready to go.

stopandth1nk 3 days ago

If AI is so great and has PhD-level skills (Musk), then logic says you should be replacing all of your _senior_ developers. That is not the conclusion they reached, which implies that the coding ability is not that hot. Q.E.D.

thallavajhula 3 days ago

Finally someone from a top position said this. After all the trash the CEOs have been spewing and sensationalizing every AI improvement, for a change, a person in a non-engineering role speaks the truth.

tehjoker 3 days ago

Unfortunately, this is the kind of view that is at once completely correct and anathema to private equity because they can squeeze a next quarter return by firing a chunk of the labor force.

newsclues 2 days ago

Rather than AI functioning as many junior coders to make a senior programmer more efficient, having AI function as a senior programmer for lots of junior programmers - one that helps them learn and limits the interruptions for human senior coders - makes so much more sense.

demirbey05 3 days ago

Yesterday, I was asked to scrape data from a website. My friend used ChatGPT to scrape the data but didn't succeed even after spending 3h+. I looked at the website's code, understood it with my web knowledge, and did some research with the LLM. Then I described to the LLM how to scrape the data; it took 30 minutes overall. The LLM can't come up with the best approach on its own, but you can create it by using the LLM. It's the same everywhere: at the end of the day you need someone who can really think.

  • jvm___ 3 days ago

    LLMs can do anything, but the decision tree for what you can do in life is almost infinite. LLMs still need a coherent designer to make progress towards a goal.

    • demirbey05 3 days ago

      LLMs can do small things well, but you must use the small parts to form the big picture.

  • isatty 3 days ago

    Or you could’ve used xpath and bs4 and have been done in an hour or two and have more understandable code.

    • demirbey05 3 days ago

      It is not that easy: there is lazy loading on the page, triggered by scrolling specific sections. You need to find a clever way - there is no way to scrape it with bs4, and it's tough even with Selenium.
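For what it's worth, the usual workaround is a scroll-until-stable loop. A minimal Python sketch, with the driver abstracted behind two hypothetical methods (`scroll_to_bottom`, `item_count`) that you would implement with whatever your stack provides, e.g. Selenium's `execute_script`:

```python
import time

def scroll_until_stable(driver, pause=0.0, max_rounds=50):
    """Scroll repeatedly until the number of loaded items stops growing.

    `driver` is any object exposing scroll_to_bottom() and item_count();
    both names are placeholders, not a real Selenium API.
    """
    last = driver.item_count()
    for _ in range(max_rounds):
        driver.scroll_to_bottom()
        time.sleep(pause)  # give lazy-loaded sections time to render
        now = driver.item_count()
        if now == last:    # nothing new appeared: we've reached the end
            return now
        last = now
    return last            # bailed out after max_rounds; page may hold more
```

With Selenium, `scroll_to_bottom` might wrap `driver.execute_script("window.scrollTo(0, document.body.scrollHeight)")` and `item_count` a CSS-selector count; for sections that lazy-load on their own internal scroll, you would scroll each container element the same way.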

farceSpherule 3 days ago

Bravo.. Finally a voice of reason.

As someone who works in AI, any CEO who says that AI is going to replace junior workers has no f*cking clue what they are talking about.

[removed] 2 days ago
[deleted]
lvl155 3 days ago

The current generation of AI agents is great at writing a block of code, similar to writing a great paragraph. Know your tools.

segmondy 3 days ago

AWS CEO says what he has to say to push his own agenda and obviously to align himself with the most currently popular view.

mhh__ 3 days ago

AWS is a very infrastructure intensive project with extremely tight SLAs, and no UI, makes a lot of sense.

the_arun 3 days ago

The point is that nobody has figured out how much AI can replace humans. There is so much hype out there, with every tech celebrity sharing their opinions without the responsibility of owning them. We have to wait & see. We can change course when we know the reality. Until then, do what we know well.

tomrod 3 days ago

My respect for people that take this approach is very high. This is the right way to approach integration of technology.

Can SOME people's jobs be replaced by AI? Maybe on paper. But there are tons of tradeoffs when you START with that approach and assume fidelity of outcome.

butterisgood 2 days ago

Agree. AI is, so far, almost as good as StackOverflow, except it lies confidently and generates questionable code.

alecco 3 days ago

Perhaps I'm too cynical about messages coming out of FAANG. But I have a feeling they are saying things to placate the rising anger over mass layoffs, h1b abuse, and offshoring. I hope I'm wrong.

senko 3 days ago

Did a double take at Berman being described as an AI investor. He does invest, but a more appropriate description would be "AI YouTuber".

I don't mean that as a negative, he's doing great work explaining AI to (dev) masses!

einpoklum 3 days ago

Simple, just replace the CEO with an LLM and it will be singing a different tune :-P

Sparkyte 3 days ago

It is too late; it is already happening. The tech field is evolving toward people being more experienced, not toward AI. But AI will be there for questions and easy one-liners - properly formalized documentation, even TLDRs.

nlawalker 3 days ago

The cost of not hiring and training juniors is trying to retain your seniors while continuously resetting expectations with them about how they are the only human accountable for more and more stuff.

rvz 3 days ago

"AGI" always has been a narrative scam after late 2022.

mvkel 3 days ago

Agreed.

LLMs are actually -the worst- at doing very specific repetitive things. It'd be much more appropriate for one to replace the CEO (the generalist) rather than junior staff.

Borg3 2 days ago

Of course it is... You should replace your senior staff with AI ;) Juniors will just prompt it then....

oblio 3 days ago

I heard from several sources that AWS has a mandate to put GenAI in everything and force everyone to use it so... yeah.

m3kw9 2 days ago

Junior is just the lowest rank; you will still have a lowest rank. Fine, don't call it junior.

geodel 3 days ago

Maybe the source of "AI replacing junior staff" is a statement the AWS CEO made during a private meeting with a client.

hazek112 3 days ago

Well yeah, they're just doing this with H1B's and OPT. The other kind of "AI".

mattmaroon 3 days ago

Why do so many people act like they’re mutually exclusive? Junior staff can use AI too.

eurekin 3 days ago

It's not often an article makes me want to buy shares. Like... never. This article is spot on.

loeg 3 days ago

Sometimes, the C suite is there because they're actually good at their jobs.

failiaf 3 days ago

junior engineers aren't hired to get tons of work done; they're hired to learn, grow, and eventually become senior engineers. ai can't replace that, but only help it happen faster (in theory anyway).

mikert89 3 days ago

This is just to walk back previous statements by Andy Jessy. Political theater

ramesh31 3 days ago

No one's getting replaced, but you may not hire that new person that otherwise would have been needed. Five years ago, you would have hired a junior to crank out UI components, or well specc'd CRUD endpoints for some big new feature initiative. Now you probably won't.

  • aksnsman 3 days ago

    > well specc'd CRUD endpoints

    I’m really tired of this trope. I’ve spent my whole career on “boring CRUD”, and the number of relational-DB-backed apps I’ve seen written by devs who’ve never heard of isolation levels is concerning (including myself for a time).

    Coincidentally, that ignorance surfaces as soon as these apps see any scale.

  • sharperguy 3 days ago

    On the other hand, that extra money can be used to expand the business in other ways, plus most kids coming out of college these days are going to be experts in getting jobs done with AI (although they will need a lot of training in writing actual secure and maintainable code).

    • ninetyninenine 3 days ago

      Even the highest ranking engineers should be experts. I don’t understand why there’s this focus on juniors as the people who know AI best.

      Using AI isn’t rocket science. Like you’re talking about using AI as if typing a prompt in English is some kind of hard to learn skill. Do you know English? Check. Can you give instructions? Check. Can you clarify instructions? Check.

      • LinuxAmbulance 3 days ago

        > I don’t understand why there’s this focus on juniors as the people who know AI best.

        Because junior engineers have no problem with wholeheartedly embracing AI - they don't have enough experience to know what doesn't work yet.

        In my personal experience, engineers who have experience are much more hesitant to embrace AI and learn everything about it, because they've seen that there are no magic bullets out there. Or they're just set in their ways.

        To management that's AI obsessed, they want those juniors over anyone that would say "Maybe AI isn't everything it's cracked up to be." And it really, really helps that junior engineers are the cheapest to hire.

    • SV_BubbleTime 3 days ago

      > plus most kids coming out of college these days are going to be experts in getting jobs done with AI

      “You won’t lose your job to AI, you’ll lose it to someone who uses AI better than you do”

  • blackhaz 3 days ago

    Sure. First line tech support as well. In many situations customers will get vastly superior service if AI agent answers the call.

    At least in my personal case, struggling with renewal at Virgin Broadband, multiple humans wasted probably an hour of everyone's time overall on the phone bouncing me around departments, unable to comprehend my request, trying to upsell and pitch irrelevant services, applying contextually inappropriate talking scripts while never approaching what I was asking them in the first place. Giving up on those brainless meat bags and engaging with their chat bot, I was able to resolve what I needed in 10 minutes.

    • kamaal 3 days ago

      It's strange you have to write this.

      In India most banks now have apps that do nearly all the banking you could do by visiting a branch in person. To that extent, this future is already here.

      When I had to close my loan and had to visit a branch a few times, the manager told me that a significant portion of his people's time now goes into actual banking - which, according to him, means selling products (fixed deposits, insurance, credit cards) - and not customer support (which the bank thinks is not its job and does only because there is currently no alternative).

    • firesteelrain 3 days ago

      > Sure. First line tech support as well. In many situations customers will get vastly superior service if AI agent answers the call.

      In IT, if at a minimum, AI would triage the problem intelligently (and not sound like a bot while doing it), that would save my more expensive engineers a lot more time.

    • supriyo-biswas 3 days ago

      This is mostly because CS folks are given such sales and retention targets; and while I’ve never encountered a helpful support bot even in the age of LLMs, I presume in your case the company management was just happy to have a support bot talking to people without said metrics.

    • Xunjin 3 days ago

      “brainless meat bags” have you ever thought they are instructed to do so to achieve product selling quotas?

      • wkat4242 3 days ago

        Anyone who blindly follows orders is a brainless meat bag too.

nektro 2 days ago

so refreshing to see this view from someone in a position high up like his

catigula 3 days ago

No it isn't, he's lying. Sorry guys.

Claude code is better than a junior programmer by a lot and these guys think it only gets better from there and they have people with decades in the industry to burn through before they have to worry about retraining a new crop.

htrp 3 days ago

> “My view is you absolutely want to keep hiring kids out of college and teaching them the right ways to go build software and decompose problems and think about it, just as much as you ever have.”

Instead you should replace senior staff who make way more.

fredgrott 3 days ago

that was not the kicker... it was the ending comment:

Better learn how to learn, as we are not training (or is that paying?) you to learn...

jimbo808 2 days ago

Hey at least there's one adult in the room in the big tech sphere.

thefz 2 days ago

LLM defenders with the "yOu cAn't cRiTiCiZe iT WiThOuT MeNtIoNiNg tHe mOdEl aNd vErSiOn, It mUsT Be a lAnGuAgE LiMiTaTiOn" crack me up. I used code generation out of curiosity once, for a very simple script, and it fucked it up so badly I was laughing.

Please tell me which software you are building with AI so I can avoid it.

underlipton 3 days ago

They're deliberately popping the bubble. If they'd actually thought this and cared, they'd have said it 2 years ago, before the layoffs.

Stop getting played.

throwaway63467 3 days ago

I mean, I used Copilot / JetBrains etc. to work on my code base, but for large-scale changes they did so much damage that it took me days to fix and actually slowed me down. These systems are just like juniors in their capabilities - actually worse, because junior developers are still people, able to think and interact with you coherently over days or weeks or months; these models aren't even at that level, I think.

charlieyu1 2 days ago

The same CEO that pushed employees back to the office?

ChrisMarshallNY 3 days ago

> “Often times fewer lines of code is way better than more lines of code,” he observed. “So I'm never really sure why that's the exciting metric that people like to brag about.”

I remember someone that had a .sig that I loved (Can't remember where. If he's here, kudos!):

> I hate code, and want as little of it in my programs as possible.

enigma101 3 days ago

finally some common sense from the detached C-suiters

rsynnott 3 days ago

The bubble has, if not burst, at least gotten to the stage where it’s bulging out uncomfortably and losing cohesion.

devmor 3 days ago

It's refreshing to finally see CEOs and other business leaders coming around to what experienced, skeptical engineers have been saying for this entire hype cycle.

I assumed it would happen at some point, but I am relieved that the change in sentiment has started before the bubble pops - maybe this will lessen the economic impact.

  • LinuxAmbulance 3 days ago

    Yeah, the whole AI thing has very unpleasant similarities to the dot com bubble that burst to the massive detriment of the careers of the people that were working back then.

    • devmor 3 days ago

      The parallels in how industry members talk about it is similar as well. No one denies that the internet boom was important and impactful, but it's also undeniable that companies wasted unfathomable amounts of cash for no return at the cost of worker well being.

blitzar 3 days ago

Juniors are cheaper than AI tokens and easier to fire and hire.

yoyohello13 3 days ago

Finally some fucking sanity from a megacorp CEO. Been a long time.

terminatornet 2 days ago

Sounds like someone got the memo that the bubble's about to burst.

octocop 3 days ago

but wait, what about my Nvidia stocks bro? Can we keep the AI hype going bro. Pls bro just make another AI assistant editor bro.

butbuttbutbut 3 days ago

[flagged]

  • mxhwll 3 days ago

    Time will show you are right and almost all the other dumbasses here at HN are wrong. Which is hardly surprising since they are incapable of applying themselves around their coming replacement.

    They are 100% engineers and 100% engineers have no ability to adapt to other professions. Coding is dead, if you think otherwise then I hope you are right! But I doubt it because so far I have only heard arguments to the counter that are obviously wrong and retarded.

  • isatty 3 days ago

    That’s a shit article by Thomas, people should stop quoting it.