Comment by QuadmasterXLII · 6 months ago · 3 replies

that's just mixture of experts
mnky9800n · 6 months ago

I thought mixture of experts didn't update itself with new sets of weights and was just a collection of already trained networks/weights? I could be wrong.

QuadmasterXLII · 6 months ago

Well, that depends on whether you keep training it.

mnky9800n · 6 months ago

Perhaps they should always be training and never static. haha. I allegedly grow wiser with age, why not neural networks?
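For context on the exchange above: a mixture-of-experts layer is just a gating network plus a set of expert sub-networks, and nothing in the architecture itself freezes the experts. Whether the experts stay "already trained" is purely a choice about which parameters you keep optimizing. Below is a minimal sketch assuming PyTorch; the expert count, dimensions, and dense (softmax over all experts) routing are illustrative, not any specific production design.

```python
# Minimal dense mixture-of-experts sketch (assumes PyTorch is installed).
import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    def __init__(self, dim_in, dim_out, num_experts=4):
        super().__init__()
        # Each expert is an independent sub-network.
        self.experts = nn.ModuleList(
            [nn.Linear(dim_in, dim_out) for _ in range(num_experts)]
        )
        # The gate learns which experts to trust for a given input.
        self.gate = nn.Linear(dim_in, num_experts)

    def forward(self, x):
        weights = torch.softmax(self.gate(x), dim=-1)                 # (batch, num_experts)
        outputs = torch.stack([e(x) for e in self.experts], dim=-1)   # (batch, dim_out, num_experts)
        return (outputs * weights.unsqueeze(1)).sum(dim=-1)           # gate-weighted combination

moe = MixtureOfExperts(dim_in=16, dim_out=8)
y = moe(torch.randn(32, 16))
loss = y.pow(2).mean()
loss.backward()  # gradients flow to the gate AND the experts

# The experts are static only if you explicitly freeze them, e.g.:
#   for p in moe.experts.parameters():
#       p.requires_grad = False
```

This illustrates QuadmasterXLII's point: the same module supports both readings. Freeze the expert parameters and you get a fixed collection of pre-trained networks routed by a gate; leave them trainable and the whole mixture keeps updating with new data.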