ajkjk 5 hours ago

As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

Suppose you flip a coin. Before flipping the coin, your knowledge is "heads or tails". After flipping it, your knowledge becomes one or the other. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.
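
To make that concrete, here's a quick sketch (Python; the only assumption is a fair coin, so each p_i is 0.5):

  import math

  # Shannon entropy H = -sum(p_i * log2(p_i)): the information, in bits,
  # gained when the outcome resolves your imperfect knowledge.
  p = [0.5, 0.5]  # "heads or tails", before the flip
  H = -sum(p_i * math.log2(p_i) for p_i in p)
  print(H)  # 1.0 -- one bit of information per fair coin flip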

The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing the macrostate and knowing the exact microstate.
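
Scaled up a little (a toy system of N coins where the macrostate is "how many heads", and, by assumption, all compatible microstates are equally likely, so S = ln W in units of k_B):

  import math

  # Macrostate: "k of N coins show heads". Microstate: exactly which coins.
  # With W = C(N, k) equally likely microstates, S = k_B ln W.
  N, k = 100, 50
  W = math.comb(N, k)   # microstates compatible with the macrostate
  S = math.log(W)       # entropy in units of k_B
  print(S)              # ~66.8: the gap between macro- and micro-knowledge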

  • antonvs an hour ago

    You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

    One of the things entropy tells us is how a system is likely to evolve in the future. But looking at it another way, entropy actually helps dictate how it will evolve. And we can prove that mathematically.
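
    Not a proof, but you can watch the tendency in a toy model (an Ehrenfest-style urn sketch of my own, not anything from upthread): particles hop randomly between two boxes, and the macrostate drifts toward the value with the most microstates, i.e. toward maximum entropy.

      import math, random

      # N labeled particles hop at random between two boxes. The macrostate
      # (count in the left box) drifts toward maximum S = ln C(N, n_left).
      random.seed(0)
      N, n_left = 1000, 1000  # start with every particle on the left
      for step in range(5001):
          if step % 1000 == 0:
              print(step, n_left, round(math.log(math.comb(N, n_left)), 1))
          # pick a particle uniformly at random; it hops to the other box
          n_left += -1 if random.random() < n_left / N else 1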

kgwgk 8 hours ago

p_i

Edit to add lots of words:

In the definition of entropy

S := -k_B sum_i p_i ln(p_i)

knowledge about the system enters the equation in the p_i terms.

The only other factor is a constant, so it's not as if there are many other places to link the entropy to the system!
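
A minimal illustration (Python; the three microstates and their probabilities are made up): same microstates, two states of knowledge, two entropies.

  import math

  def S(p):  # Gibbs entropy in units of k_B: -sum(p_i ln p_i)
      return -sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

  # Same three microstates, different knowledge about the system:
  print(S([1/3, 1/3, 1/3]))  # ~1.099 (= ln 3): no constraints, uniform p_i
  print(S([0.8, 0.1, 0.1]))  # ~0.639: a measurement has sharpened the p_i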

  • mr_mitm 8 hours ago

    Please communicate with me in full sentences.

    I can only guess that your objection is something about probabilities. A microstate has a probability independent of my knowledge of the system, just as the probability of being dealt a royal flush doesn't change after I draw five cards. The probability of my ending the game with a royal flush might change, but that is not what we mean by these probabilities.
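
    Concretely (a sketch with the standard five-card-draw numbers):

      from math import comb

      # Unconditional probability that a five-card hand is a royal flush:
      print(4 / comb(52, 5))  # ~1.54e-6 (four suits, 2,598,960 hands)

      # Conditional probability *given my knowledge* that I end up royal,
      # holding A-K-Q-J of spades with one card to come: 1/48. That number
      # is knowledge-dependent, but it is not what is meant by the
      # probability of the microstate above.
      print(1 / 48)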

    • kgwgk 8 hours ago

      The same microstate will have different probabilities depending on which constraints or measurements are used in _your_ description of the system.

      If you choose to describe the system using its microstate - and you know it - there are no probabilities anywhere.

      You can of course know something and choose to ignore it - the entropy is still a reflection of the uncertainty (actual, or accepted for the sake of a lower-resolution model).
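
      A sketch of that (made-up four-microstate system): the entropy tracks the description you use, not the system alone.

        import math

        def S(p):  # S = -sum(p_i ln p_i), in units of k_B
            return sum(q * math.log(1 / q) for q in p if q > 0)

        # Full knowledge of the microstate: no uncertainty left.
        print(S([1, 0, 0, 0]))              # 0.0
        # Same system, deliberately coarser description:
        print(S([0.25, 0.25, 0.25, 0.25]))  # ~1.386 (= ln 4)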

      • tsimionescu 7 hours ago

        But the point is that, regardless of how you choose to describe or even measure the system, it will need exactly the same amount of heat to raise its temperature by 1 degree (or the same kinetic energy to increase the average velocity of its constituents by the same amount, in the microstate framework). So there is some objective character to entropy; it's not merely a function of subjective knowledge of a system. Or, to put it another way, two observers with different amounts of information about the microstate of a system will still measure it as having the same entropy.
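
        For example, the calorimetric measurement never asks about microstates at all (a sketch assuming a constant heat capacity; the water figure is just for scale):

          import math

          # Clausius: dS = dQ/T. For constant heat capacity C, heating
          # from T1 to T2 gives delta_S = C * ln(T2 / T1) -- a number any
          # observer gets from a thermometer and a calorimeter.
          C = 4184.0            # J/K, roughly 1 kg of water
          T1, T2 = 300.0, 301.0
          print(C * math.log(T2 / T1))  # ~13.9 J/K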