Comment by geomark
I thought we were well past trying to understand mathematics. After all, John von Neumann long ago said "In mathematics we don't understand things. We just get used to them."
> I thought we were well past trying to understand mathematics. After all, John von Neumann long ago said "In mathematics we don't understand things. We just get used to them."
Just because someone said it doesn't mean we all agree with it, fortunately.
You know the meme with the normal distribution where the far right and the far left reach the same conclusion for different reasons, and the ones in the middle have a completely different opinion?
So on the far right you have people like von Neumann, who say "In mathematics we don't understand things". On the far left you have people like you, who say "me no maths". Then in the middle you have people like me, who say "maths is interesting, let me do something I enjoy".
von Neumann liked saying things that he knew would get reactions like "so deep" and "he's so smart". Like when asked how he knew the answer, claiming that he did the sum in his head, when undoubtedly he knew the closed-form expression.
I have a tingling suspicion that you might have missed the joke.
To date I have not met anyone who thought he summed the terms of the infinite geometric series term by term. That would take infinite time. Of course he used the closed-form expression for the sum of a geometric series.
The joke is that he missed the clever solution, which does not require setting up the series, recognising that it's a geometric progression, and then using the closed form.
The clever solution just finds the time needed for the trains to collide, then multiplies that by the bird's speed. No series needed.
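To spell out why the two approaches agree, with the numbers left symbolic since the anecdote doesn't fix any: say the trains start a distance d apart, each moving at speed v toward the other, and the bird flies at speed u > v.

```latex
% Clever solution: the gap closes at speed 2v, so the trains collide at
% t = d/(2v), and the bird has simply flown u*t:
\[
  u \cdot \frac{d}{2v} = \frac{ud}{2v}.
\]
% Series solution: on the first leg the closing speed is u + v, so that leg
% is a_1 = ud/(u+v); afterwards the remaining gap is d(u-v)/(u+v), so each
% leg shrinks by the ratio r = (u-v)/(u+v). Summing the geometric series:
\[
  \sum_{k \ge 0} a_1 r^k
  = \frac{ud}{u+v} \cdot \frac{1}{\,1 - \frac{u-v}{u+v}\,}
  = \frac{ud}{u+v} \cdot \frac{u+v}{2v}
  = \frac{ud}{2v}.
\]
```

Same answer either way; spotting the collision time first just skips all the series bookkeeping.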
Ah. I was going by memory, and I had those two as separate stories. I didn't remember that he said "I did the sum" on the trains problem.
Many ideas in math are extremely simple at heart. Some very precise definitions, maybe a clever theorem. The hard part is often: Why is this result important? How does this result generalize things I already knew? What are some concrete examples of this idea? Why are the definitions the way they are, and not something slightly different?
To use an example from functional programming, I could say:
- "A monad is basically a generalization of a parameterized container type that supports flatMap and newFromSingleValue."
- "A monad is a generalized list comprehension."
- Or, famously, "A monad is just a monoid in the category of endofunctors, what's the problem?"
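To pin down the first description, here is a minimal sketch in Scala using Option as the container; Option, Some, and the two little helper functions are my choice of illustration, not anything from the list above, and "newFromSingleValue" corresponds to just wrapping a plain value with Some:

```scala
// A minimal sketch: Option as a "parameterized container with flatMap and
// newFromSingleValue". Here Some(x) / Option(x) plays the role of
// newFromSingleValue, and flatMap does the chaining.
object MonadSketch {
  def parseInt(s: String): Option[Int] = s.toIntOption   // Scala 2.13+
  def reciprocal(n: Int): Option[Double] =
    if (n == 0) None else Some(1.0 / n)

  def main(args: Array[String]): Unit = {
    // flatMap chains the two steps and short-circuits on None:
    println(parseInt("4").flatMap(reciprocal)) // Some(0.25)
    println(parseInt("0").flatMap(reciprocal)) // None

    // The same chain as a for-comprehension, which is what the
    // "generalized list comprehension" phrasing is getting at:
    val viaFor = for {
      n <- parseInt("4")
      r <- reciprocal(n)
    } yield r
    println(viaFor) // Some(0.25)
  }
}
```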
The basic idea, once you get it, is trivial. But the context, the familiarity, the basic examples, and the relationships to other ideas take a while to sink in. And once they do, you ask "That's it?"
So the process of understanding monads usually isn't some sudden flash of insight, because there's barely anything there. It's more a situation where you work with the idea long enough and you see it in a few contexts, and all the connections become familiar.
(I have a long-term project to understand one of the basic things in category theory, "adjoint functors." I can read the definition just fine. But I need to find more examples that relate to things I already care about, and I need to learn why that particular abstraction is a particularly useful one. Someday, I presume I'll look at it and think, "Oh, yeah. That thing. It's why interesting things X, Y and Z are all the same thing under the hood." Everything else in category theory has been useful up until this point, so maybe this will be useful, too?)