Comment by jwr
What I find somewhat puzzling is that these machines were used for the "really big problems". We used supercomputers for weather forecasting, finite element simulations, molecular modeling. And we were getting results.
I don't feel we are getting results that are thousands of times better today.
> I don't feel we are getting results that are thousands of times better today.
You are getting results that are far more than thousands of times better. You just aren't aware of where they are showing up.
To give you a glimpse: the same modelling problems that a couple of decades ago took days to produce a crude solution are now being solved inside the inner loop of optimization problems.
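To make that concrete, here's a toy sketch (simulate() is a placeholder I made up, not a real solver) of what "a solve inside an optimization loop" looks like:

```python
# Toy sketch, not an actual workflow: simulate() stands in for a full
# FEM/CFD solve that used to take days and now runs in seconds, so an
# optimizer can afford to call it over and over.
from scipy.optimize import minimize

def simulate(design):
    """Placeholder physics solve: returns the misfit between a toy
    surrogate response and a target deflection of 0.2."""
    thickness = design[0]
    deflection = 1.0 / (1.0 + thickness**2)  # toy surrogate model
    return (deflection - 0.2) ** 2

# The optimizer evaluates simulate() dozens of times per run -- the
# "simulation inside an optimization loop" pattern described above.
result = minimize(simulate, x0=[1.0], method="Nelder-Mead")
print(result.x, result.fun)
```

In a real workflow the toy surrogate would be a full multiphysics solve, which is exactly why the per-solve speedup matters so much.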
You are also seeing multiphysics and coupled problems showing up in mundane applications. We're talking about taking the same models that used to require days to solve and augmenting them with double or triple the degrees of freedom.
Without these supercomputers the size of credit cards, the whole field of computer-aided engineering would not exist.
Also, to be fair, there are indeed diminishing returns. Some of the extra computational power simply lifts constraints, such as letting you use doubles instead of floats: that knocks 3 or 4 decimal places off the numerical error with no change to the model, at the cost of taking around 4 times longer to solve the same problem.
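For a rough feel of that precision trade-off, here's a small made-up example comparing single and double precision on the same computation:

```python
# Toy sketch of the float-vs-double point above: double precision buys
# several extra correct digits on the same sum, at the cost of more
# memory and (in real solvers) longer run times.
import numpy as np

n = 10_000_000
terms32 = np.full(n, 0.1, dtype=np.float32)
terms64 = np.full(n, 0.1, dtype=np.float64)

exact = 0.1 * n
print("float32 error:", abs(float(terms32.sum()) - exact))  # visibly nonzero
print("float64 error:", abs(float(terms64.sum()) - exact))  # orders of magnitude smaller
```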
To top things off, do you think the results of two decades ago were possible without a great deal of simplifications and crude approximations? As legend has it, the F-117 Nighthawk got its faceted design because of the computational limits of the time. Since then, stealth aircraft have become more capable and far smoother in shape. That's what you get when your computational resources are a thousand times better.