borracciaBlu 42 minutes ago

I was writing a small article about [Set, Set Builder Notation, and Set Comprehension](https://adropincalm.com/blog/set-set-builder-natatio-set-com...), and while I was investigating it surprised me how many different ways there are to describe the same thing. E.g., see all the notations for a set or a tuple.
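
For instance (a toy example of mine, not one from the article), the same four-element set can be written with a bar separator, a colon separator, or comprehension-style, all meaning exactly the same thing:

  \{1, 2, 3, 4\} = \{\, x \in \mathbb{N} \mid 0 < x^2 < 20 \,\}
                 = \{\, x \in \mathbb{N} : 0 < x^2 < 20 \,\}
                 = \{\, x : x \in \mathbb{N},\ 0 < x^2 < 20 \,\}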

One last rant point: you don't have "the manual" of math in the same way you can go to your programming language's man page, so there is no single source of truth.

Everybody assumes...

  • BlackFingolfin 29 minutes ago

    I find it strange to compare "math" with one programming language. Mathematics is a huge and diverse field, with many subcommunities and hence also differing notation.

    With the sides reversed, your rant would be akin to: "It's surprising how many different ways there are to describe the same thing. E.g., see all the notations for dictionaries (hash tables? associative arrays? maps?) or lists (vectors? arrays?).

    You don't have "the manual" of programming languages. "

johngossman 33 minutes ago

Mathematics is such an old field, older than anything except arguably philosophy, that it's too broad and deep for anyone to really understand everything. Even in graduate school I often took classes in things discovered by Gauss or Euler centuries before. A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old. So you end up having to spend years specializing, and then struggle to find others with the same background.

All of which is compounded by the desire to provide minimal "proofs from the book" and leave out the intuitions behind them.

  • ekjhgkejhgk 30 minutes ago

    > A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old.

    Do you know the reason for that? The reason is that those problems are open and easy to understand. For the rest of the open problems, you need expertise even to understand the problem statement.

  • Davidzheng 19 minutes ago

    Actually, a lot of minimal proofs expose more intuition than the older proofs people found at first. Counterintuitively, I usually don't find reading the first proofs of results very enlightening.

  • scotty79 27 minutes ago

    > Mathematics is such an old field, older than anything except arguably philosophy

    If we are already venturing outside of the scientific realm with philosophy, I'm sure the fields of literature or politics are older. Especially since philosophy is just a subset of literature.

    • saithound 11 minutes ago

      > I'm sure fields of literature or politics are older.

      As far as anybody can tell, mathematics is way older than literature.

      The oldest known proper accounting tokens are from 7000ish BCE, and show a genuine understanding of addition and multiplication.

      The people who made the Ishango bone 25k years ago were probably aware of at least rudimentary addition.

      The earliest writings are from the 3000s BCE, and are purely administrative. Literature, by definition, appeared later than writing.

geomark 43 minutes ago

I thought we were well past trying to understand mathematics. After all, John von Neumann long ago said "In mathematics we don't understand things. We just get used to them."

  • agumonkey a minute ago

    It's probably a neurological artefact. When the brain has spent enough time looking at a pattern, it can suddenly become obvious. You can go from blind to enlightened without the usual conscious logical effort. It's very odd.

  • ekidd 10 minutes ago

    Many ideas in math are extremely simple at heart. Some very precise definitions, maybe a clever theorem. The hard part is often: Why is this result important? How does this result generalize things I already knew? What are some concrete examples of this idea? Why are the definitions the way they are, and not something slightly different?

    To use an example from functional programming, I could say (a rough code sketch follows this list):

    - "A monad is basically a generalization of a parameterized container type that supports flatMap and newFromSingleValue."

    - "A monad is a generalized list comprehension."

    - Or, famously, "A monad is just a monoid in the category of endofunctors, what's the problem?"
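
    A quick sketch of the first phrasing, using plain arrays (illustrative only: Array.prototype.flatMap is real ES2019, while newFromSingleValue is just the made-up name from above):

      // Toy sketch: plain arrays as the simplest monad-like container.
      // "newFromSingleValue" here just wraps one value in the container.
      const newFromSingleValue = <T>(x: T): T[] => [x];

      // "A monad is a generalized list comprehension": nested flatMap is
      // what a comprehension like [x + y | x <- xs, y <- ys] desugars to.
      const xs = [1, 2];
      const ys = [10, 20];
      const sums = xs.flatMap(x => ys.flatMap(y => newFromSingleValue(x + y)));
      console.log(sums); // [11, 21, 12, 22]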

    The basic idea, once you get it, is trivial. But the context, the familiarity, the basic examples, and the relationships to other ideas take a while to sink in. And once they do, you ask "That's it?"

    So the process of understanding monads usually isn't some sudden flash of insight, because there's barely anything there. It's more a situation where you work with the idea long enough and you see it in a few contexts, and all the connections become familiar.

    (I have a long-term project to understand one of the basic things in category theory, "adjoint functors." I can read the definition just fine. But I need to find more examples that relate to things I already care about, and I need to learn why that particular abstraction is a particularly useful one. Someday, I presume I'll look at it and think, "Oh, yeah. That thing. It's why interesting things X, Y and Z are all the same thing under the hood." Everything else in category theory has been useful up until this point, so maybe this will be useful, too?)
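
    (For what it's worth, one adjoint pair that does show up directly in everyday code is currying: the bijection between functions (X, A) => B and functions X => A => B is exactly the product/exponential adjunction. A toy sketch:

      // The currying ("product ⊣ exponential") adjunction as plain functions.
      // curry and uncurry are mutually inverse, which is the adjunction's
      // hom-set bijection Hom(X × A, B) ≅ Hom(X, A → B).
      const curry =
        <X, A, B>(f: (x: X, a: A) => B) =>
        (x: X) =>
        (a: A): B =>
          f(x, a);

      const uncurry =
        <X, A, B>(g: (x: X) => (a: A) => B) =>
        (x: X, a: A): B =>
          g(x)(a);

      // Round trip: addition in both shapes.
      const add = (x: number, a: number) => x + a;
      console.log(curry(add)(2)(3));          // 5
      console.log(uncurry(curry(add))(2, 3)); // 5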

  • ekjhgkejhgk 37 minutes ago

    Just because someone said it doesn't mean we all agree with it, fortunately.

    You know the meme with the normal distribution where the far right and the far left reach the same conclusion for different reasons, and the ones in the middle have a completely different opinion?

    So on the far right you have people like von Neumann who say "In mathematics we don't understand things". On the far left you have people like you who say "me no mats". Then in the middle you have people like me, who say "maths is interesting, let me do something I enjoy".

    • geomark 32 minutes ago

      Of course. I just find it hilarious that someone like von Neumann would say that.

      • ekjhgkejhgk 29 minutes ago

        von Neumann liked saying things that he knew would get reactions like "so deep" and "he's so smart". Like when he was asked how he got the answer, claiming that he did the sum in his head when undoubtedly he knew the closed-form expression.

        • srean 9 minutes ago

          I have a tingling suspicion that you might have missed the joke.

          To date I have not met anyone who thought he summed the terms of the infinite geometric series term by term. That would take infinite time. Of course he used the closed-form expression for the sum of a geometric series.

          The joke is that he missed a clever solution that does not require setting up the series, recognising it as a geometric progression, and then using the closed form.

          The clever solution just finds the time needed for the trains to collide, then multiplies that by the bird's speed. No series needed.
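
          With made-up numbers (the usual telling varies: say the trains start 100 miles apart, each doing 50 mph, and the bird flies at 75 mph), both routes come out the same:

            // Illustrative numbers, not from the thread.
            const gapStart = 100;  // miles between the trains
            const trainSpeed = 50; // mph, each train
            const birdSpeed = 75;  // mph

            // Clever route: the trains close the gap at 100 mph, so they
            // collide after 1 hour, and the bird just flies for that hour.
            const direct = birdSpeed * (gapStart / (2 * trainSpeed)); // 75 miles

            // Von Neumann's route: sum the bird's back-and-forth legs.
            // Each leg ends when the bird meets the oncoming train; the gap
            // shrinks by a constant factor per leg, giving a geometric series
            // (here first leg 60 miles, ratio 0.2, so 60 / (1 - 0.2) = 75).
            let gap = gapStart;
            let series = 0;
            for (let i = 0; i < 50; i++) {
              const t = gap / (birdSpeed + trainSpeed); // time until bird meets the train
              series += birdSpeed * t;                  // distance flown on this leg
              gap -= 2 * trainSpeed * t;                // both trains advanced meanwhile
            }
            console.log(direct, series); // both ≈ 75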