QuadmasterXLII · 3 days ago
that's just mixture of experts
  mnky9800n · 3 days ago
  i thought mixture of experts didn't update itself with new sets of weights and was just a collection of already-trained networks/weights? I could be wrong.

    QuadmasterXLII · 3 days ago
    Well, that depends on whether you keep training it.

      mnky9800n · 3 days ago
      perhaps they should always be training and never static. haha. i allegedly grow wiser with age, why not neural networks?
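For context on the question above: in the usual formulation, a mixture-of-experts layer is not a frozen collection of pre-trained networks; the router and the experts are parameters of one model and are trained jointly end-to-end. A rough sketch of the forward pass, with purely illustrative names and dimensions (not taken from any particular paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer: a softmax router weights the outputs
# of several small linear "experts". During training, gradients would
# flow into BOTH `experts` and `router` -- they are ordinary trainable
# parameters, not pre-trained frozen networks.
n_experts, d_in, d_out = 4, 8, 3
experts = rng.normal(size=(n_experts, d_in, d_out))  # one weight matrix per expert
router = rng.normal(size=(d_in, n_experts))          # routing weights

def moe_forward(x):
    logits = x @ router
    gates = np.exp(logits - logits.max())
    gates /= gates.sum()                       # softmax gate over experts
    outs = np.einsum('i,eio->eo', x, experts)  # each expert's output, shape (n_experts, d_out)
    return gates @ outs                        # gate-weighted combination, shape (d_out,)

y = moe_forward(rng.normal(size=d_in))
print(y.shape)  # (3,)
```

Sparse variants route each input to only the top-k experts for efficiency, but the point stands either way: whether the weights keep changing is a property of whether you keep training, exactly as the reply says.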