Comment by matt-p
To be honest, AI datacentres would be a rip-and-replace to get back to normal datacentre density, at least on the cooling and power systems.
Maybe useful for some kind of manufacturing or industrial process.
The GPUs, sure. The mainboards and CPUs can be used in clusters for general-purpose computing, which is still more prevalent in most scientific research as far as I am aware. My alma mater has a several-thousand-core cluster that any student can request time on as long as they have reason to do so, and it's all CPU compute. Getting non-CS majors to write GPU code is unlikely in that scenario.
I provide infrastructure for such a cluster that is also available to anyone at the university free of charge. Every year we swap out the oldest 20% of the cluster, as we run a five-year depreciation schedule. In the last three years, we’ve mostly been swapping in GPU resources at a ratio of about 3:1. That’s in response to both usage reports and community surveys.
> Getting non-CS majors to write GPU code is unlikely in that scenario.
People mostly use a GPU-enabled liblapack. Physics, chemistry, biology, and medicine departments can absolutely use the GPUs.
Cheap compute would be a boon for science research.
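To make that concrete, here is a rough sketch of the kind of library-mediated GPU use I mean. CuPy is just one example I'm picking (nobody above named it), and it assumes a CUDA GPU and a CuPy install; the point is that the user writes ordinary array code and the library dispatches to GPU-backed LAPACK/BLAS routines, with no kernel code in sight:

```python
# Illustrative sketch, not anyone's actual workflow: CuPy is a NumPy-like
# library whose linear-algebra routines call cuSOLVER/cuBLAS on the GPU.
import cupy as cp

# Build a random dense system A x = b directly on the GPU.
n = 4096
A = cp.random.rand(n, n, dtype=cp.float64)
b = cp.random.rand(n, dtype=cp.float64)

# Solve on the GPU; the user never writes CUDA code.
x = cp.linalg.solve(A, b)

# Check the residual and copy the result back to the host as a NumPy array.
residual = cp.linalg.norm(A @ x - b)
print("residual:", float(residual))
x_host = cp.asnumpy(x)
```

Swap the import for NumPy and the same code runs on CPU, which is exactly why non-CS researchers can pick up GPU nodes without retraining.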