Comment by hackinthebochs 10 hours ago
Think minimum description length. Low entropy states require fewer terms to fully describe than high entropy states. This is an objective property of the system.
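A rough way to see this is to use compressed size as a stand-in for description length (a crude proxy, not a real thermodynamic entropy calculation):

```python
import random
import zlib

# Low-entropy "state": every particle in the same configuration.
ordered = bytes([0] * 10_000)

# High-entropy "state": pseudo-random configurations (fixed seed for repeatability).
rng = random.Random(42)
disordered = bytes(rng.randrange(256) for _ in range(10_000))

# The ordered state compresses to a tiny description;
# the disordered one is nearly incompressible.
print(len(zlib.compress(ordered)))
print(len(zlib.compress(disordered)))
```

The exact byte counts depend on the compressor, but the ordering is robust: the low-entropy state always admits the shorter description.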
Maxwell's demon thought experiment implies they are the same concept. Given complete knowledge of every particle of a gas, you can in principle create unphysically low-entropy distributions of the particles. This[1] goes into more detail.
[1] https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_...
A fun visual explanation: https://youtu.be/8Uilw9t-syQ?si=D9sR2YAm40SPFG3a
No, it's a representation construct, i.e. how you describe a system in a given basis. The basis can be purely mathematical: Fourier coefficients, for example.
Mathematical notation is a human invention, but the structure that mathematics describes is objective. The choice of basis changes the absolute number of terms, but the relative number of significant terms for a more disordered versus a less disordered state is generally fixed, outside of degenerate cases.
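To make the Fourier example concrete, here is a sketch (the 99% energy threshold is an arbitrary choice for illustration): a smooth signal concentrates its energy in a few coefficients, while a disordered one spreads it across nearly all of them.

```python
import numpy as np

n = 1024
t = np.linspace(0, 1, n, endpoint=False)

# Ordered: a smooth signal whose energy sits in a few Fourier terms.
smooth = np.sin(2 * np.pi * 3 * t)

# Disordered: white noise, energy spread across nearly all terms.
rng = np.random.default_rng(0)
noise = rng.standard_normal(n)

def terms_for_99pct(x):
    """Number of Fourier coefficients needed to capture 99% of the signal energy."""
    power = np.sort(np.abs(np.fft.rfft(x)) ** 2)[::-1]
    cum = np.cumsum(power) / power.sum()
    return int(np.searchsorted(cum, 0.99) + 1)

print(terms_for_99pct(smooth))  # a handful of terms
print(terms_for_99pct(noise))   # hundreds of terms
```

A change of basis rescales these counts, but the ordered state stays far more compactly describable than the disordered one.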
You're thinking of information entropy, which is not the same concept as entropy in physics. An ice cube in a warm room can be described with the minimum description "ice cube in a warm room" (a crystal structure inside a fluid space), but if you wait until the heat death of the universe, you just have "a warm room" (a smooth fluid space), which has an even shorter MDL. Von Neumann should never have repurposed the term entropy from physics. Entropy confuses a lot of people, including me.