Comment by iLoveOncall 2 days ago

Mediocre people produce mediocre work. Using AI might make those mediocre people produce even worse work, but I don't think it'll affect competent people who have standards regardless of the available tooling.

If anything the outcome will be good: mediocre people will produce even worse work and will weed themselves out.

Case in point: the author of the rebuttal made basic and obvious mistakes that make his work even easier to dismiss, and no further paper of his will be taken seriously.

Arainach 2 days ago

>mediocre people will produce even worse work and will weed themselves out.

[[Citation needed]]

I don't believe anyone who has experience working with other people - in the workplace, in school, wherever - believes that people get weeded out for mediocre output.

  • Muromec 2 days ago

    Weeded out to where anyway? Doing some silly thing, like being a cashier or taxi driver?

  • delusional 2 days ago

    You can also be mediocre in a lot of different ways. Some people are mediocre thinkers, but fantastic hype men. Some people are fantastic at thinking, but suck at playing the political games you have to play in an office. Personally I find that I need some of all of those aspects to have success in a project, the amount varies by the work and external collaborators.

    Intelligence isn't just one measure you can have less or more of. I thought we figured this out 20 years ago.

drsim 2 days ago

I think the pull will be hard to resist even for competent people.

Like the obesity crisis driven by sugar highs, the overall population will be affected, and overall quality will suffer, at least for a while.

bananapub 2 days ago

> Mediocre people produce mediocre work. Using AI might make those mediocre people produce even worse work, but I don't think it'll affect competent people who have standards regardless of the available tooling.

this is clearly not the case, given:

- mass layoffs in the tech industry to force more use of such things
- extremely strong pressure from management to use it, rarely framed as "please use this tooling as you see fit"
- extremely low quality bars in all sorts of things, e.g. getting your dumb "We wrote a 200 word prompt then stuck that and some web scraped data into an LLM run by Google/OpenAI/Anthropic" site to the top of hacker news, or most of VC funding in the tech world
- extremely large swathes of (at least) the western power structures not giving a shit about doing anything well, e.g. the entire US Federal government leadership now, the UK government's endless idiocy about "AI Policy development", or lawyers getting caught in court having not even read the documents they put their name on
- actual strong desire from many people to outsource their toxic plans to "AI", e.g. the US's machine-learning probation and sentencing tools

I don't think any of us are ready for the tsunami of garbage that's about to be thrown into every facet of our lives: government policy, sending people to jail, murdering people with robots, spamming open source projects with useless code and bug reports, and on and on.