Comment by TeMPOraL
> I don't think you're going to find an LLM with a large enough context window to have a meaningfully involving story spanning multiple sessions.
Sure you will.
> An LLM isn't going to craft a story element tailored to a character, or more importantly, an individual player.
Sure it is.
> An LLM also doesn't really understand the game rules and isn't going to be able to adjudicate house rules based on fun factor.
Sure it will.
You need to use the tools for their purpose, not for the opposite of it. LLMs have finite context, you need to manage it. LLMs don't have a built-in loop, you need to supply it.
Character stats, names, details about players - those are inputs, and structured ones at that. LLMs shouldn't store them - that's what storage media are for, whether in-memory or a database or a piece of paper. Nor should they manipulate them directly - that's what game systems are for, whether implemented in code or in a rulebook run on a human DM. LLMs are there to make decisions - local, intuitive decisions, based on what is in their context. That could be deciding what a character says in a given situation. Or how to continue the story based on a worldbuilding database. Or how to update the worldbuilding database based on what it just added to the story. Etc.
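The separation described above can be sketched as a small loop. Everything here is hypothetical illustration: `llm_decide` stands in for a real model call (stubbed so the sketch stays runnable), and the state layout and rules are invented examples, not any particular game system. The point is the division of labor: state lives in plain storage, rules live in code, and the LLM only picks among options based on context you assemble for it.

```python
import json

# Structured state: stored outside the model, never "remembered" by it.
world_db = {
    "characters": {"Kira": {"hp": 12, "class": "ranger"}},
    "facts": ["Kira distrusts kobolds"],
}

def assemble_context(db, scene):
    """Context management: pull only the relevant slice of state
    into the prompt, since the model's context is finite."""
    return json.dumps({"scene": scene, "state": db}, indent=2)

def llm_decide(context, options):
    """Hypothetical LLM call: given context, pick one of the offered
    options. Stubbed to return the first option so this runs offline."""
    return options[0]

def apply_rules(db, decision):
    """The game system, not the LLM, mutates state."""
    if decision == "attack":
        db["characters"]["Kira"]["hp"] -= 2  # e.g. cost of a counterattack
    return db

# The loop the comment says you must supply yourself:
context = assemble_context(world_db, scene="a kobold blocks the path")
decision = llm_decide(context, options=["attack", "negotiate"])
world_db = apply_rules(world_db, decision)
print(decision, world_db["characters"]["Kira"]["hp"])
```

In this shape the model never holds authoritative state; if the process restarts, `world_db` is the source of truth and the next prompt is rebuilt from it.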
> Character stats, names, details about players - those are inputs, and structured ones at that.
Some details about players are structured and can be easily stored and referenced. Some aren't. Consider a character who, through emergent gameplay, develops a slight bias against kobolds; who's going to pick up on that and store it in a database (and at what point)? What if a player extemporaneously gives a monologue about their grief at losing a parent? Will the entire story be stored? Will it be processed into structured chunks to be referenced later? Will the LLM just shove "lost a father" into a database?
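To make the "processed into structured chunks" option concrete, here is one hypothetical shape such a pass could take: after a session, an extraction step (`extract_facts`, stubbed with keyword rules here; a real system would use an LLM call) turns the freeform transcript into short fact records with provenance. The transcript and facts are invented examples. This only illustrates the mechanism; it does not answer whether such a pass would reliably notice something as subtle as an emergent bias against kobolds.

```python
transcript = (
    "Kira hangs back while the others talk to the kobold merchant. "
    "Later she admits she still grieves for her father."
)

def extract_facts(text):
    """Hypothetical extraction pass, stubbed with keyword matching so
    the sketch runs offline. A real version would prompt an LLM to
    emit structured facts from the transcript."""
    facts = []
    if "kobold" in text:
        facts.append("Kira is wary around kobolds")
    if "father" in text:
        facts.append("Kira lost her father")
    return facts

# Store extracted chunks with provenance, for later retrieval into context.
memory = [{"fact": f, "source": "session transcript"} for f in extract_facts(transcript)]
print(memory)
```

The open problems the questions above point at live inside `extract_facts`: what counts as a fact, when the pass runs, and how much nuance ("slight bias", grief expressed in a monologue) survives compression into a record like these.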
Given current limitations, I don't see how you design a system that won't forget important details, particularly across many sessions.