Comment by pona-a
You might be interested in reading the literature on Knowledge Tracing, a superset of spaced repetition that aims to predict what a student knows from their answer history.
The simplest approach is Bayesian Knowledge Tracing [0], which is just a simple probability update plus an Expectation Maximization optimizer to fit the latent parameters. The standard version assumes an independent set of skills and no forgetting, but there are extensions that relax those assumptions [1] [2]. PyBKT [3] implements some common ones, so take a look there.
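The per-answer update at the core of BKT is a few lines of Bayes' rule. Here's a minimal sketch (the parameter values are made-up placeholders; in practice EM fits them per skill):

```python
def bkt_update(p_know, correct, p_transit=0.1, p_guess=0.2, p_slip=0.1):
    """One Bayesian Knowledge Tracing step.

    p_know:    prior probability the student knows the skill
    correct:   whether the observed answer was correct
    p_transit: chance of learning the skill at each opportunity
    p_guess:   chance of answering correctly without knowing the skill
    p_slip:    chance of answering incorrectly despite knowing it
    """
    if correct:
        # P(known | correct) via Bayes' rule
        posterior = (p_know * (1 - p_slip)) / (
            p_know * (1 - p_slip) + (1 - p_know) * p_guess)
    else:
        # P(known | incorrect)
        posterior = (p_know * p_slip) / (
            p_know * p_slip + (1 - p_know) * (1 - p_guess))
    # Apply the learning transition: the student may have just learned it
    return posterior + (1 - posterior) * p_transit
```

A correct answer pushes the estimate up, an incorrect one pushes it down, and the transition term means mastery never decreases between opportunities, which is exactly the "no forgetting" assumption the extensions above address.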
Learning Factor Analysis [4] seems to work considerably better than naive BKT while being very simple to implement (it's a logistic model, sharing the family with most spaced repetition algorithms), so that might be promising if your hierarchical skill dependencies are incorporated. Some researchers have been applying increasingly complex NNs [5], including transformers, but personally I think that's just more parameters to overfit.
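To illustrate why LFA is so simple: the log-odds of a correct answer is just a sum of a student proficiency term, a skill easiness term, and a per-skill learning rate times the practice count. A sketch, with hypothetical parameter values (real ones come from fitting a logistic regression to answer logs):

```python
import math

def lfa_predict(theta, beta, gamma, opportunities):
    """Learning Factor Analysis prediction for one (student, skill) pair.

    theta:         student proficiency
    beta:          skill easiness
    gamma:         per-skill learning rate
    opportunities: number of prior practice attempts on this skill
    """
    logit = theta + beta + gamma * opportunities
    return 1 / (1 + math.exp(-logit))  # sigmoid: log-odds -> probability
```

Because everything is additive in the logit, fitting it is ordinary logistic regression over one-hot student/skill features plus the opportunity counts, and extra features (e.g. dependency structure) can be added as columns.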
I'm not an expert. I only built a basic KT tool to prepare for the national Ukrainian exam and found these along the way.
[0] https://www.cs.williams.edu/~iris/res/bkt-balloon/index.html
[1] https://link.springer.com/chapter/10.1007/978-3-319-07221-0_...
[2] https://educationaldatamining.org/EDM2011/wp-content/uploads...
[3] https://github.com/CAHLR/pyBKT
[4] https://arxiv.org/pdf/2105.15106v4
[5] https://pykt-toolkit.readthedocs.io/en/latest/models.html