Comment by skywhopper 3 hours ago
Which is exactly why we’ve likely reached the peak of LLM capabilities. Everything is poisoned now as training material.
Eh. Discovering how neurons can be coaxed into memorizing things with almost perfect recall was cool, but real AGI or even ASI shouldn't require the sum total of all human-generated data to train.