Comment by notpushkin 6 months ago

This was my first idea as well. Keep training continuously and redeploy clones after each cycle. From a layman's perspective this seems reasonable :thinking:

Comment by maleldil 6 months ago

You can't realistically keep training the same model forever: it will gradually forget what it learned earlier as new updates overwrite old ones. The proper name for this failure mode is "catastrophic forgetting".
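
A minimal sketch of the effect maleldil describes, using a toy setup I've chosen for illustration (scikit-learn's MLPClassifier on the digits dataset; none of this comes from the thread itself): train a small network on one subset of classes, then keep training it only on a disjoint subset, and accuracy on the first subset collapses.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
task_a = y < 5       # "old" data: digits 0-4
task_b = ~task_a     # "new" data: digits 5-9

clf = MLPClassifier(hidden_layer_sizes=(64,), random_state=0)

# Phase 1: learn task A (classes declared up front for partial_fit).
for _ in range(30):
    clf.partial_fit(X[task_a], y[task_a], classes=np.arange(10))
print("task A accuracy after phase 1:", clf.score(X[task_a], y[task_a]))

# Phase 2: continue training on task B only, with no replay of task A.
# The same weights get overwritten, so task A performance degrades.
for _ in range(30):
    clf.partial_fit(X[task_b], y[task_b])
print("task A accuracy after phase 2:", clf.score(X[task_a], y[task_a]))
print("task B accuracy after phase 2:", clf.score(X[task_b], y[task_b]))
```

Running this, the task A score drops sharply after phase 2 even though task B is learned fine, which is exactly why "just keep training the same model" doesn't work without some mitigation like replaying old data.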