Comment by 9rx 18 hours ago

> I can have an LLM generate me code-like text all day long, but that doesn't mean it's building a system.

I don't follow. An LLM doesn't magically output code-like text, or anything else for that matter, on a whim. You have to build a system that describes your intent to the machine. Only then might it output code-like text, if that's what your system describes. It's not the execution of your code that makes you an engineer; it's building a system that can be executed in the first place that makes you a (software) engineer.

QuercusMax 18 hours ago

You said a few comments upthread (and I quote): "What else can you do with code (and LLMs; same thing) other than build systems?"

There are many things you can do with LLMs other than build systems. That's my point. Using an LLM doesn't make you an engineer; that's preposterous.

  • 9rx 18 hours ago

    > There are many things you can do with LLMs other than build systems.

    Like what? The code-like output example clearly requires you to build a system before the LLM can evaluate your program. If I want an LLM to generate a bedtime story, I also need to build a system that defines that. Where's the escape hatch?

    Maybe you're thinking of AGI? While everyone has their own pet AGI definition, many see it as the point where you no longer have to build the system and the machine can start to take on that role. But we don't have that yet, and it isn't what we're talking about anyway.