Comment by abuani 5 hours ago

Can it though? Everything I've seen and experienced suggests that LLMs are very good at making it appear that they do those things, but the number of times I've gotten stuck on "you're absolutely right!" when correcting an LLM suggests that it cannot reason by any means, nor does it learn. Otherwise, an LLM would never get stuck in a loop.