Comment by whatever1
I think even with the trashy API and drivers, if they released graphics cards with 4x the memory of the Nvidia equivalents, the community would put in the effort to make them work.
This: provide cards with far larger VRAM pools than the competition, giving a real edge in LLM inference, and the users will come.
The same thing happened with Bitcoin.
Yeah. Easier said than done, I know, but they need to not just catch up to nVidia but leapfrog them somehow.
I would have said that releasing cards with 32GB+ of onboard RAM, or better yet 128GB, would have gotten things moving. They'd be able to run/train models that nVidia's consumer cards couldn't.
But I think nVidia closed that gap with their "Project Digits" (or whatever the final name is) PCs.
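The VRAM numbers above can be sanity-checked with back-of-the-envelope math. This is a rough sketch (the helper name is made up, and it counts weights only, ignoring KV cache and activation overhead):

```python
# Rough VRAM estimate for LLM inference: weight storage only.
# Ignores KV cache, activations, and framework overhead, so real
# requirements are somewhat higher.
def weights_vram_gb(params_billions: float, bits_per_param: int) -> float:
    bytes_total = params_billions * 1e9 * (bits_per_param / 8)
    return bytes_total / 2**30  # GiB

# A 70B-parameter model at various precisions:
for bits in (16, 8, 4):
    print(f"70B @ {bits}-bit: {weights_vram_gb(70, bits):.0f} GB")
```

At fp16 a 70B model needs roughly 130 GB just for weights, which is why a hypothetical 128GB consumer card would open up model sizes that 24GB-class cards can only touch with aggressive quantization.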