Comment by pona-a
I don't have a list, but another popular one was this [0]. They trained a one-layer attention-only transformer and showed its weights can be read off directly as tables of bigram and skip-trigram ("A… B → C") statistics.
[0] https://transformer-circuits.pub/2021/framework/index.html
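The trick, roughly, is that with one attention layer and no MLP, the logits decompose into a direct path (W_U W_E, a bigram table) plus an attention path whose QK and OV circuits can each be collapsed into a vocab-by-vocab matrix. A minimal sketch with random stand-in weights and made-up toy dimensions (not the paper's trained model):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d_model, d_head = 50, 16, 4  # hypothetical toy sizes

# Random stand-in weights for a one-layer attention-only transformer
# (illustrative only; the paper analyzes trained weights).
W_E = rng.standard_normal((d_model, vocab))  # token embedding
W_U = rng.standard_normal((vocab, d_model))  # unembedding
W_Q = rng.standard_normal((d_head, d_model))
W_K = rng.standard_normal((d_head, d_model))
W_V = rng.standard_normal((d_head, d_model))
W_O = rng.standard_normal((d_model, d_head))

# Direct path W_U W_E: a vocab x vocab table of bigram-like logits
# ("given current token A, how much is next token C boosted?").
bigram = W_U @ W_E

# QK circuit: qk[B, A] = how strongly current token B attends back to token A.
qk = W_E.T @ W_Q.T @ W_K @ W_E

# OV circuit: ov[C, A] = if A is attended to, how much token C's logit rises.
ov = W_U @ W_O @ W_V @ W_E

# A skip-trigram "A ... B -> C" is strong when qk[B, A] and ov[C, A] are both large.
B, A = np.unravel_index(qk.argmax(), qk.shape)
C = ov[:, A].argmax()
print(f"strongest skip-trigram: {A} ... {B} -> {C}")
```

With trained weights, scanning these two matrices for large entries is how the interpretable bigrams and skip-trigrams fall out.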