Comment by impossiblefork 2 days ago
We can't throw infinitely many things into the context, though.
My impression is that GPT-5 gets confused, not right away, but after a couple of pages it has no idea what's going on. It doesn't take pages upon pages before it forgets things.
I’m currently experimenting with prompts of ~300k tokens for a classification task, and I think I might be able to make it work. GPT-5 chokes, but Gemini 2.5 Pro is showing promise. The jury’s still out, and I might change my tune in a couple of weeks.
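For scale, here's a rough way to sanity-check whether a prompt of that size even fits a model's advertised context window. The ~4 characters/token ratio is a common English-text heuristic, not a real tokenizer, and the window sizes below are my assumptions from vendor announcements, so treat all the numbers as ballpark:

```python
# Back-of-the-envelope context-window check.
# Window sizes are ASSUMED from public announcements; verify against
# the vendor's current docs before relying on them.
CONTEXT_WINDOWS = {
    "gpt-5": 400_000,          # assumed limit
    "gemini-2.5-pro": 1_000_000,  # assumed limit
}

def estimated_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Crude token estimate: ~4 chars/token for English prose."""
    return int(len(text) / chars_per_token)

def fits(model: str, prompt: str) -> bool:
    """True if the estimated token count fits the model's window."""
    return estimated_tokens(prompt) <= CONTEXT_WINDOWS[model]
```

By this estimate a ~300k-token prompt (~1.2M characters) squeaks under both limits, which is why whether the model can actually *use* that context, rather than merely accept it, becomes the real question.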