Comment by logicchains 11 hours ago

Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions. Quantum mechanics assumes, as the name implies, that reality at the smallest level can be quantised, so it's completely appropriate to apply entropy when describing things at the microscopic scale.

aurareturn 9 hours ago

If we knew the exact state of every particle in an enclosed system, we could calculate its future states exactly. There would be no need to reason about possible states.
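
A toy sketch of that determinism (mine, purely illustrative: a single free particle, exact arithmetic via fractions so nothing probabilistic sneaks in):

  from fractions import Fraction

  # Toy deterministic evolution: given the exact initial state,
  # every future state is fixed; no probability distribution is needed.
  x = Fraction(0)          # exact initial position
  v = Fraction(1)          # exact velocity
  dt = Fraction(1, 10)     # exact time step
  for _ in range(10):
      x += v * dt          # x(t) = x0 + v*t, with no uncertainty
  print(x)                 # exactly 1: the future state follows with certainty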

  • IAmBroom 7 hours ago

    Since that's not possible in any physical system of one or more particles, it's irrelevant.

kgwgk 9 hours ago

> Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions.

There are no probability distributions over possible states when there is perfect knowledge of the state.

> Quantum mechanics

Entropy is also zero for a pure quantum state. You won’t have entropy without imperfect knowledge.
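
One way to see this numerically (a sketch; the two states below are arbitrary examples): the von Neumann entropy S = -Tr(rho ln rho) vanishes for any pure state and is maximal for the maximally mixed state.

  import numpy as np

  def von_neumann_entropy(rho):
      # S = -Tr(rho ln rho), computed from the eigenvalues of rho
      evals = np.linalg.eigvalsh(rho)
      evals = evals[evals > 1e-12]        # 0 ln 0 -> 0 by convention
      return -np.sum(evals * np.log(evals))

  pure = np.array([[1.0, 0.0],
                   [0.0, 0.0]])           # |0><0|: state known perfectly
  mixed = np.eye(2) / 2                   # maximally mixed qubit

  print(von_neumann_entropy(pure))        # 0 (pure state)
  print(von_neumann_entropy(mixed))       # ln 2 ~ 0.693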

  • whereismyacc 9 hours ago

    > There are no probability distributions over possible states when there is perfect knowledge of the state.

I know very little about physics, but I thought the leading interpretations of quantum mechanics say that the probability distribution is all we can know about a system. The entropy is not due to a lack of information about the quantum state, but because the outcomes are inherently stochastic?

    • kgwgk 9 hours ago

      Entropy is about the state - not about “outcomes”.

      “All we can know” is the precise state - at least in principle - and entropy is zero in that case.

  • mr_mitm 9 hours ago

    Just look at the definition of entropy. Knowledge about a system never enters the equation.

    S := -k_B sum p_i ln (p_i)
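
    Spelled out in code (a sketch; the distributions are just illustrative):

      import numpy as np

      k_B = 1.380649e-23                      # Boltzmann constant, J/K

      def gibbs_entropy(p):
          # S = -k_B * sum_i p_i ln(p_i)
          p = np.asarray(p, dtype=float)
          p = p[p > 0]                        # 0 ln 0 -> 0 by convention
          return -k_B * np.sum(p * np.log(p))

      print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))      # 0 (one certain microstate)
      print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # k_B ln 4 ~ 1.91e-23 J/K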

    • ajkjk 5 hours ago

      As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

      Suppose you flip a coin. Before flipping the coin, your knowledge is "heads or tails". After flipping it, your knowledge becomes one of either heads or tails. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.

      The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing a macrostate versus the exact microstate.
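
      To put numbers on the coin example (a sketch, in bits; the four-flip macrostate is just an illustration):

        import math

        def shannon_entropy_bits(p):
            # H = -sum_i p_i log2(p_i): the information gained when the
            # imperfect knowledge "one of these outcomes" gets resolved
            return -sum(q * math.log2(q) for q in p if q > 0)

        print(shannon_entropy_bits([0.5, 0.5]))  # fair coin: 1 bit

        # Macrostate vs microstate: knowing only "2 heads in 4 flips" leaves
        # C(4,2) = 6 equally likely microstates, i.e. log2(6) bits unresolved
        print(shannon_entropy_bits([1/6] * 6))   # ~2.585 bits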

      • antonvs an hour ago

        You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

One of the things entropy tells us is how a system is likely to evolve in the future. But looking at this another way, entropy actually helps dictate how it will evolve, and we can prove that mathematically.
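
        A toy way to see that (my own sketch, an Ehrenfest-style hopping model, not anything from the thread): start all particles in one half of a box and let them hop at random; the system drifts toward the macrostates with the most microstates, i.e. the highest entropy.

          import random
          from math import comb, log

          N = 100
          left = N                 # all particles start in the left half: low entropy
          for _ in range(2000):
              if random.random() < left / N:
                  left -= 1        # a randomly chosen left particle hops right
              else:
                  left += 1        # a randomly chosen right particle hops left

          # Entropy (in units of k_B) of the macrostate "left particles = left";
          # it is maximal near left = N/2, which is where the system ends up
          print(left, log(comb(N, left)))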

    • kgwgk 9 hours ago

      p_i

      Edit to add lots of words:

      In the definition of entropy

      S := -k_B sum p_i ln (p_i)

      knowledge about the system enters the equation in the p_i terms.

      The other term is a constant so it’s not like there are many other choices to link the entropy to the system!
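
      Concretely (a sketch, with made-up numbers): two observers describing the same system with different knowledge assign different p_i, and therefore different entropies.

        from math import log

        def entropy(p):
            # -sum_i p_i ln(p_i), in units of k_B
            return -sum(q * log(q) for q in p if q > 0)

        # Observer A knows the exact microstate out of four possibilities
        print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0 (state known exactly)
        # Observer B only knows "it's one of these four"
        print(entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386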

      • mr_mitm 9 hours ago

        Please communicate in full sentences with me.

I can only guess that your objection is something about probabilities. A microstate has a probability independent of my knowledge of the system, just as the probability of a royal flush doesn't change after I've drawn five cards. The probability that I end the game with a royal flush might change, but that is not what we mean by these probabilities.
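
        Here's that distinction in numbers (a sketch; the draw-poker setup is just an example):

          from math import comb

          # Unconditional probability of a 5-card royal flush: fixed by the
          # rules of the game, no matter what anyone has seen
          p_royal = 4 / comb(52, 5)
          print(p_royal)           # ~1.54e-6

          # The conditional probability given my hand does change: e.g. holding
          # A,K,Q,J of spades with one card still to come from the other 48
          p_given_four = 1 / 48
          print(p_given_four)      # ~0.0208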