btilly 3 days ago

I disagree with calling them all failures.

Everything we know in numerical analysis is based on discreteness rather than continuity. Every useful predictive model that we have of the world, for example for forecasting weather, depends on numerical analysis.

I consider this a success!

adrian_b 3 days ago

Nope.

Forecasting the weather, designing semiconductor devices, and all other such activities are based on continuous mathematical models, e.g. systems of partial differential equations or systems of integral equations.

Only in the last stage of solving such a problem is the continuous mathematical model approximated by a discrete one, so that a digital computer can be used. The discrete model is constructed with a standard method, e.g. finite elements, boundary elements, or finite differences. Most of these methods approximate an unknown function with infinitely many degrees of freedom by a function determined by a finite number of parameters, and then approximate the operations required to compute those parameters by operations on floating-point numbers.
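
To make that last stage concrete, here is a minimal sketch in Python (all constants are illustrative, not from the comment): the continuous model is the 1D heat equation, the unknown function is replaced by its values on a finite grid via finite differences, and the resulting arithmetic is carried out in floating point.

    import numpy as np

    # Continuous model: the 1D heat equation u_t = alpha * u_xx on [0, 1]
    # with u(0, t) = u(1, t) = 0. The unknown u has infinitely many degrees
    # of freedom; the grid below replaces it by n sampled values.
    alpha = 1.0
    n = 101                    # number of grid points (the finite parameters)
    dx = 1.0 / (n - 1)
    dt = 0.4 * dx**2 / alpha   # explicit scheme needs dt <= dx**2 / (2 * alpha)

    x = np.linspace(0.0, 1.0, n)
    u = np.sin(np.pi * x)      # initial condition, sampled on the grid

    for _ in range(1000):
        # Centered difference for u_xx, then one explicit Euler step,
        # all in floating-point arithmetic.
        u[1:-1] += dt * alpha * (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2

    # The continuous model has the exact solution exp(-alpha*pi^2*t)*sin(pi*x),
    # so the error introduced by discretizing can be measured directly.
    t = 1000 * dt
    print(np.max(np.abs(u - np.exp(-alpha * np.pi**2 * t) * np.sin(np.pi * x))))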

Such an approximation method is something entirely different from formulating a discrete mathematical model of physics that is taken to be the exact model.

Even the graphics of a game are based on continuous models of space, not on discrete ones, in which it would be very difficult to implement something like the rotation of an object or a perspective view.
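
As an illustration of that continuous treatment, a minimal Python sketch (the camera distance and rotation angle are made up for the example): the vertices live in continuous R^3 as floating-point coordinates, rotation is an orthogonal matrix, and a perspective view is a division by depth.

    import numpy as np

    # The cube's vertices are points of continuous R^3, stored as floats.
    theta = np.deg2rad(30.0)                 # rotation angle (illustrative)
    rot_y = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                      [ 0.0,           1.0, 0.0          ],
                      [-np.sin(theta), 0.0, np.cos(theta)]])

    verts = np.array([[x, y, z] for x in (-1.0, 1.0)
                                for y in (-1.0, 1.0)
                                for z in (-1.0, 1.0)])

    rotated = verts @ rot_y.T                # continuous rotation about the y axis

    # Perspective view: divide x and y by each point's distance from a
    # camera placed at z = 5. On a discrete model of space, both the
    # rotation and this division would be awkward to even define.
    depth = 5.0 - rotated[:, 2]
    projected = rotated[:, :2] / depth[:, None]
    print(projected)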

The failures to which I have referred are the attempts to create such discrete models of physics, e.g. models in which space and time are discrete rather than continuous.

These attempts have nothing in common with the approximation techniques, which are indeed the basis for most successes in the use of digital computers.

  • gyrovagueGeist 3 days ago

    Yep! Optimizing first (solving in infinite dimensions) and then discretizing onto a finite basis has typically led to much better and more stable methods than a discretize-then-optimize approach (a small sketch of the first ordering is below).

    Time-scale calculus is a pretty niche theoretical field that looks at blending the analysis of difference and differential equations, but I'm not aware of any algorithmic advances based on it.
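
    A minimal sketch of the optimize-then-discretize ordering (the model problem and constants are illustrative): for the problem of minimizing the integral of u'(x)^2/2 - f(x)u(x) over u with u(0) = u(1) = 0, optimizing first in infinite dimensions gives the Euler-Lagrange condition -u'' = f, and only that equation is then discretized and solved.

      import numpy as np

      # min_u  integral of (u'(x)**2 / 2 - f(x) * u(x)) dx,  u(0) = u(1) = 0.
      # Optimizing in infinite dimensions first gives the Euler-Lagrange
      # condition -u''(x) = f(x); only that equation is then discretized.
      n = 99
      h = 1.0 / (n + 1)
      x = np.linspace(h, 1.0 - h, n)   # interior grid points
      f = np.sin(np.pi * x)

      # Tridiagonal finite-difference Laplacian for -u'' with zero boundaries.
      A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
      u = np.linalg.solve(A, f)

      # The continuous minimizer is sin(pi*x) / pi**2; the discrete solution
      # tracks it to O(h**2).
      print(np.max(np.abs(u - np.sin(np.pi * x) / np.pi**2)))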