Comment by deadbabe 11 hours ago

This is a little anthropomorphic. The faster option is to tell it to give you the full content of an ideal context for what you’re doing and adjust or expand as necessary. Less back and forth.

7thpower 7 hours ago

It’s not, though. One of the key gaps right now is that people do not provide enough direction on the tradeoffs they want to make. Generally, LLMs will not ask you about them; they will just go off and build. But if you have them ask, they will often come back with important questions about things you did not specify.

  • MrDunham 3 hours ago

    This is the correct answer. I like to go one step further than the root comment:

    Nearly all of my "agents" are required to ask at least three clarifying questions before they're allowed to do anything (code, write a PRD, write an email newsletter, etc)

    Force it to ask one at a time and it's even better, though not as much of a step-function improvement vs. if it just went off your initial ask.

    I think the reason is exactly what you state, @7thpower: it takes a lot of thinking to really provide enough context and direction to an LLM, especially (in my opinion) because LLMs are so cheap and carry no social capital cost. If you have a colleague or employee work for a week just to throw away all their work, that is a very non-zero cost.
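    One way to encode the "at least three questions, one at a time" requirement is a standing instruction in the agent's system prompt. The wording below is illustrative, not the commenter's actual prompt:

    ```
    Before producing any deliverable (code, a PRD, an email newsletter, etc.),
    you must ask me at least three clarifying questions. Ask them one at a
    time, and wait for my answer before asking the next. Only begin the work
    once I have answered your questions.
    ```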

    • iaw 2 hours ago

      My routine is:

      Prompt 1: <define task> Do not write any code yet. Ask any questions you need for clarification now.

      Prompt 2: <answer questions> Do not write any code yet. What additional questions do you have?

      Reiterate until questions become unimportant.
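      The routine above can be sketched as a small loop. Everything here is a hypothetical stand-in: `call_model` represents whatever chat API you use, and `answer_questions` represents your own replies to the model; neither is a real library function.

      ```python
      NO_CODE = "Do not write any code yet."

      def clarification_loop(task, call_model, answer_questions, max_rounds=5):
          """Run the prompt-1 / prompt-2 routine: define the task, collect the
          model's questions, answer them, and repeat until it has none left
          (or max_rounds is reached)."""
          history = []
          prompt = f"{task} {NO_CODE} Ask any questions you need for clarification now."
          for _ in range(max_rounds):
              questions = call_model(history, prompt)
              history.append((prompt, questions))
              if not questions.strip():  # model has no further questions; stop
                  break
              answers = answer_questions(questions)
              prompt = f"{answers} {NO_CODE} What additional questions do you have?"
          return history
      ```

      The returned history of (prompt, questions) pairs can then be fed into the final "now write the code" request as accumulated context.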

  • deadbabe 4 hours ago

    They don’t know what to ask. They only assemble questions according to training data.

    • 7thpower an hour ago

      It seems like you are trying to steer toward a different point or topic.

      In the course of my work, I have found they ask valuable clarifying questions. I don’t care how they do it.

    • fuzzzerd 3 hours ago

      While true, the questions are all points where the LLM would have "assumed" an answer, and by asking, you get to point it in the right direction instead.

manmal 11 hours ago

Can you give me the full content of the ideal context of what you mean here?