Comment by kgwgk 9 hours ago

> Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions.

There are no probability distributions over possible states when there is perfect knowledge of the state.

> Quantum mechanics

Entropy is also zero for a pure quantum state. You won’t have entropy without imperfect knowledge.

whereismyacc 9 hours ago

> There are no probability distributions over possible states when there is perfect knowledge of the state.

I know very little about physics, but I thought that the leading interpretations of quantum physics say that the probability distribution is all we can know about a system. The entropy is not due to a lack of information about the quantum state, but because the outcomes are inherently stochastic?

  • kgwgk 9 hours ago

    Entropy is about the state - not about “outcomes”.

    “All we can know” is the precise state - at least in principle - and entropy is zero in that case.

mr_mitm 9 hours ago

Just look at the definition of entropy. Knowledge about a system never enters the equation.

S := -k_B sum p_i ln (p_i)
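In code, that definition is just the following (a minimal Python sketch; the example distributions are made up for illustration):

  import math

  def gibbs_entropy(p, k_B=1.380649e-23):
      # S = -k_B * sum_i p_i * ln(p_i); terms with p_i = 0 contribute nothing.
      return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

  print(gibbs_entropy([0.25] * 4))  # uniform over 4 microstates: k_B * ln(4)
  print(gibbs_entropy([1.0]))       # one state with probability 1: exactly 0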

  • ajkjk 5 hours ago

    As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

    Suppose you flip a coin. Before flipping it, your knowledge is "heads or tails". After flipping it, you know which of the two it is. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.

    The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing a macrostate versus the exact microstate.
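    Spelled out in a quick Python sketch (mine, purely illustrative; Shannon entropy in bits rather than k_B units):

      import math

      def shannon_bits(p):
          # H = -sum_i p_i * log2(p_i): bits of missing knowledge about the outcome
          return -sum(pi * math.log2(pi) for pi in p if pi > 0)

      print(shannon_bits([0.5, 0.5]))  # before the flip: 1 bit unknown
      print(shannon_bits([1.0]))       # after the flip: 0 bits -- knowledge resolved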

    • antonvs an hour ago

      You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

      One of the things entropy tells us is how a system is likely to evolve in the future. But looking at this another way, entropy actually helps dictate how it will evolve, and we can prove that mathematically.
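      As an illustrative sketch of that claim (a toy system of 100 two-state particles, with all microstates taken as equally likely -- my assumption for the example, not something derived here):

        from math import comb, log

        N = 100                      # 100 two-state particles ("coins")
        total = 2 ** N               # all microstates, assumed equally likely

        # Multiplicity and relative weight of the macrostate "k heads":
        for k in (0, 25, 50):
            omega = comb(N, k)       # microstates compatible with that macrostate
            print(k, log(omega), omega / total)

      The k = 50 macrostate alone covers roughly 8% of all microstates, and the region around it covers nearly all of them, so a system wandering among microstates is overwhelmingly likely to end up in the high-entropy macrostates.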

  • kgwgk 8 hours ago

    p_i

    Edit to add lots of words:

    In the definition of entropy

    S := -k_B sum p_i ln (p_i)

    knowledge about the system enters the equation in the p_i terms.

    The only other term, k_B, is a constant, so it's not as if there are many other places where the entropy could be linked to the system!
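    Concretely (a toy sketch with six microstates; the numbers are only for illustration):

      import math

      def S(p):  # entropy in units of k_B
          return -sum(pi * math.log(pi) for pi in p if pi > 0)

      # Same system, same six microstates -- only the p_i (the knowledge) differ:
      print(S([1/6] * 6))              # all you know is "one of six": ln(6) ~ 1.79
      print(S([1.0, 0, 0, 0, 0, 0]))   # you know exactly which one: 0.0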

    • mr_mitm 8 hours ago

      Please communicate in full sentences with me.

      I can only guess that your objection is something about the probabilities. A microstate has a probability independent of my knowledge of the system, just as the probability of a five-card hand being a royal flush doesn't change after I've drawn my five cards. The probability of my ending the game with a royal flush might change, but that is not what we mean by these probabilities.
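      The card analogy in a quick sketch (my numbers, just to separate the two probabilities being conflated):

        from math import comb

        # Unconditional probability that a 5-card hand is a royal flush:
        print(4 / comb(52, 5))   # 4 suits out of C(52,5) hands, ~1.5e-06, fixed

        # Conditional probability that *my* hand ends up a royal flush,
        # given I already hold 10-J-Q-K of hearts with one card to come:
        print(1 / 48)            # only the ace of hearts among the 48 unseen cards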

      • kgwgk 8 hours ago

        The same microstate will have different probabilities depending on which constraints or measurements are used in _your_ description of the system.

        If you choose to describe the system using its microstate - and you know it - there are no probabilities anywhere.

        You can of course know something and choose to ignore it - the entropy is still a reflection of uncertainty (actual uncertainty, or uncertainty accepted for the sake of a lower-resolution model).