Comment by tonyoconnell 10 months ago

I understand what you mean, but test-first development by LLMs will solve a lot of the problems with hallucinations, and soon LLMs will be much better at coding than humans. I am always surprised that so many highly intelligent people don't understand this.

marcus_holmes 10 months ago

Have you tried writing tests with an LLM?

Because I have, and it has not been the experience you're describing. The LLM hallucinated the error message it was testing for (one it had itself written five minutes earlier, in the very file it had been given as the source to test).
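To make that concrete, here's a hypothetical sketch of the failure mode (the function, the messages, and the test are all invented for illustration; source and test are in one file so it runs as-is under pytest):

    # Hypothetical illustration: the model writes code raising one error
    # message, then writes a test asserting a paraphrase recalled from
    # "memory" rather than the actual string in the file it was given.

    import pytest


    def load_config(settings: dict) -> dict:
        """Pretend this is the code the LLM wrote first."""
        if "host" not in settings:
            raise ValueError("missing required key: 'host'")
        return settings


    def test_missing_host():
        # The test the LLM wrote minutes later. The expected message
        # below is a hallucinated paraphrase, so the test fails against
        # the model's own code.
        with pytest.raises(ValueError, match="host key was not provided"):
            load_config({})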

I don't think this can be solved with the current methodology we're using to create these assistants. I remain arguably highly intelligent, and definitely convinced that LLMs need to evolve a good deal more before they surpass humans as coders.

spacebacon 10 months ago

Assuming the end goal is to serve humanity: will a human always be better at using an LLM, or will LLMs eventually be better at using LLMs to serve humanity?