meindnoch 11 hours ago

I don't get it.

To me, entropy is not a physical thing, but a measure of our imperfect knowledge about a system. We can only measure the bulk properties of matter, so we've made up a number to quantify how imperfectly those bulk properties describe the true microscopic state of the system. But if we had the ability to zoom into the microscopic level, entropy would make no sense.

So I don't see how gravity or any other fundamental physical interaction could follow from entropy. It's a made-up thing by humans.

antonvs 11 hours ago

Your perspective is incorrect.

Physical entropy governs real physical processes. Simple example: why ice melts in a warm room. More subtle example: why cords get tangled up over time.

Our measures of entropy can be seen as a way of summarizing, at a macro level, the state of a system such as that warm room containing ice, or a tangle of cables, but the measure is not the same thing as the phenomenon it describes.

Boltzmann's approach to entropy makes the second law pretty intuitive: there are far more ways for a system to be disordered than ordered, so over time it tends towards higher entropy. That’s why ice melts in a warm room.
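 
You can see the counting directly with a toy model - a quick sketch in Python (illustrative only), tallying how many microstates of 100 coin flips fall into each "number of heads" macrostate:

    from math import comb

    N = 100                # coins
    total = 2 ** N         # equally likely microstates

    for heads in (0, 10, 25, 50):      # macrostate: number of heads
        ways = comb(N, heads)          # microstates in that macrostate
        print(f"{heads:3d} heads: {ways:.3e} microstates, "
              f"fraction {ways / total:.2e}")

The 50-heads macrostate contains about 10^29 microstates; all-heads contains exactly one. A system wandering among microstates at random overwhelmingly ends up near 50/50 - that is all "tends towards higher entropy" means.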

  • aeonik 8 hours ago

    My take, for what it's worth:

    Entropy isn’t always the driver of physical change, sometimes it’s just a map.

    Sometimes that map is highly isomorphic to the physical process, like in gas diffusion or smoke dispersion. In those cases, entropy doesn't just describe what happened, it predicts it. The microstates and the probabilities align tightly with what’s physically unfolding. Entropy is the engine.

    But other times, like when ice melts, entropy is a summary, not a cause. The real drivers are bond energies and phase thresholds. Entropy increases, yes, but only because the system overcame physical constraints that entropy alone can’t explain. In this case, entropy is the receipt, not the mechanism.

    So the key idea is this: entropy’s usefulness depends on how well it “sees” the real degrees of freedom that matter. When it aligns closely with the substrate, it feels like a law. When it doesn't, it’s more like coarse bookkeeping after the fact.

    The second law of thermodynamics is most “real” when entropy is the process. Otherwise, it’s a statistical summary of deeper physical causes.

    • lumost 6 hours ago

      What makes entropy interesting is that you can describe many physical processes through analysis of the system's degrees of freedom. This pattern repeats regularly despite the systems being radically different.

      So you can interpret entropy as being about as real as potential energy or Newton's laws: very useful for calculation, subject to evolution laws which are common across all systems - but potentially giving way as an approximation under a finer-grained view (although the finer-grained view is also subject to the same rules).

  • ludwik 10 hours ago

    > there are far more ways for a system to be disordered than ordered

    I'm a complete layman when it comes to physics, so forgive me if this is naive - but aren't "ordered" and "disordered" concepts tied to human perception or cognition? It always seemed to me that we call something "ordered" when we can find a pattern in it, and "disordered" when we can't. Different people or cultures might be able to recognize patterns in different states. So while I agree that "there are more ways for a system to be disordered than ordered," I would have thought that's a property of how humans perceive the world, not necessarily a fundamental truth about the universe.

    • mr_mitm 8 hours ago

      You only hear these terms in layman explanations. Physics has precise definitions for these things. When we say "ordered", we mean that a particular macrostate has only a few possible microstates.

      Check this Wikipedia article for a quick overview: https://en.wikipedia.org/wiki/Microstate_(statistical_mechan...

      Details can be found in any textbook on statistical mechanics.

      • Gravityloss 7 hours ago

        Exactly. The coin-flipping example is a very nice way to put it. It works because the coins are interchangeable: you just count the number of heads or tails.

        If the coins were of different color and you took that into account, then it wouldn't work.

        It's not intuitive to me what gravity has to do with entropy though, as it's classically just a force and completely reversible (unlike entropy). I.e., if you saw a video of undisturbed objects only affected by gravity, you couldn't tell if the video was reversed.

    • hackinthebochs 10 hours ago

      Think minimum description length. Low entropy states require fewer terms to fully describe than high entropy states. This is an objective property of the system.
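 
      A compression sketch makes the objectivity visible (Python, illustrative only - a compressor merely upper-bounds the true minimum description length, but the gap it exposes is real):

          import random, zlib

          random.seed(0)
          ordered = b"AB" * 5000                # patterned, low entropy
          disordered = bytes(random.randrange(256) for _ in range(10000))

          print(len(zlib.compress(ordered)))     # tiny: the pattern is the description
          print(len(zlib.compress(disordered)))  # ~10000 bytes: essentially incompressible

      Any general-purpose compressor finds the same asymmetry; no observer-dependent choice enters.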

      • sat_solver 8 hours ago

        You're thinking of information entropy, which is not the same concept as entropy in physics. An ice cube in a warm room can be described using a minimum description length as "ice cube in a warm room" (or a crystal structure inside a fluid space), but if you wait until the heat death of the universe, you just have "a warm room" (a smooth fluid space), which will have an even shorter MDL. Von Neumann should never have repurposed the term entropy from physics. Entropy confuses a lot of people, including me.

      • zmgsabst 7 hours ago

        “Number of terms” is a human language construct.

      • amelius 10 hours ago

        In a deterministic system you can just use the time as a way to describe a state, if you started from a known state.

  • refactor_master 10 hours ago

    I think the original post is confused exactly because of “tangled cords” analogies. Something being “messy” in our daily lives can be a bit subjective, so using the same analogies for natural forces may seem a tad counterintuitive actually.

    Maybe it would be more fitting to say that it just so happens that our human definition of “messy” aligns with entropy, and not that someone decided what messy atoms look like.

    I’d say a bucket of water is more neat than a bucket of ice, macroscopically.

  • geon 8 hours ago

    It has been suggested that time too is derived from entropy. At least the single-directionality of it. That’d make entropy one of the most real phenomena in physics.

  • meindnoch 10 hours ago

    > Simple example: why ice melts in a warm room.

    Ice melting is simply the water molecules gaining enough kinetic energy (from collisions with the surrounding air molecules) that they break the bonds that held them in the ice crystal lattice. But at the microscopic level it's still just water molecules acting according to Newton's laws of motion (forgetting about quantum effects of course).

    Now, back on the topic of the article: consider a system of 2 particles separated by some distance. Do they experience gravity? Of course they do. They start falling towards the midpoint between them. But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

    • tsimionescu 8 hours ago

      > But where is entropy in this picture? How do you even define entropy for a system of 2 particles?

      The answer is that this doesn't happen in a system with only 2 particles. The idea of gravity as an entropic phenomenon is that you introduce some other kind of particle that permeates spacetime, so there is no system that only contains 2 particles. You may use some idea like virtual particles from quantum field theory, or you may define "quanta of spacetime" as something that is not technically a particle but basically works like one in a handwavy sense.

      But the basic point of these entropy-based theories is to explain gravity, and typically spacetime itself, as an emergent result of a collection of numerous objects of some kind. This necessarily means that they don't make sense if applied to idealized systems with very few objects - which is why they typically posit such isolated systems simply can't actually exist in reality.

    • ccozan 9 hours ago

      Let me try to answer. Let's say the particles are experiencing gravity as a natural entropic phenomenon. They will attract until they become so close that they are now seen as a single particle. The new system has a lower entropy and a higher gravity than before.

      Explanation seems very rudimentary but that is the gist of the theory.

      From my point of view, I might add the layer of information density. Every quantum fluctuation is an event, and the more particles there are, the more information is produced in a defined volume of space. But there is no theory of information that is linked to the physics, so... let me leave it at that :).

  • HelloNurse 10 hours ago

    But "disordered" and "ordered" states are just what we define them to be: for example, cords are "tangled" only because we would prefer arrangements of cords with less knots, and knots form because someone didn't handle the cords carefully.

    Physical processes are "real", but entropy is a figment.

    • dekken_ 10 hours ago

      I believe you are correct.

      Entropy is not a physical quantity; it is a measure of how far a system is from equilibrium.

      Lots of people talk about order/disorder or macro and micro states, not realizing these are things we've invented and aren't physical in nature.

      • kgwgk 8 hours ago

        > Entropy is not a physical quantity, it is a measure of how far a system is from equilibrium.

        That’s funny because the original thermodynamic entropy is defined only for systems in equilibrium.

  • kgwgk 8 hours ago

    > Physical entropy governs real physical processes

    > the measure is not the same thing as the phenomenon it describes.

    There is some tension between those claims.

    The latter seems to support the parent comment’s remark questioning whether a “fundamental physical interaction could follow from entropy”.

    It seems more appropriate to say that entropy follows from the physical interaction - not to be confused with the measure used to describe it.

    One may say that pressure is an entropic force and physical entropy governs the real physical process of gas expanding within a piston.

    However, one may also say that it's the kinetic energy of the gas molecules that governs the physical process - which arguably is a more fundamental and satisfactory explanation.

prof-dr-ir 10 hours ago

Good question. You are absolutely right that entropy is always fundamentally a way to describe our lack of perfect knowledge of the system [0].

Nevertheless, there is a distinct "reality" to entropic forces, in the sense that they are something that can actually be measured in the lab. If you are not convinced then you can look at:

https://en.wikipedia.org/wiki/Entropic_force

and in particular the example that is always used in a first class on this topic:

https://en.wikipedia.org/wiki/Ideal_chain
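 
(For reference, the standard result in that second example: a freely jointed chain of N links of length b has, to Gaussian order, a configuration count giving

    S(R) = S_0 - 3 k_B R^2 / (2 N b^2)

for end-to-end distance R, so the purely entropic restoring force is

    F = -T dS/dR = (3 k_B T / (N b^2)) R

a Hookean spring whose stiffness grows with temperature - which is how rubber actually behaves, and is measurable in the lab.)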

So when viewed in this way entropy is not just a "made-up thing", but an effective way to describe observed phenomena. That makes it useful for effective but not fundamental laws of physics. And indeed the wiki page says that entropic forces are an "emergent phenomenon".

Therefore, any reasonable person believing in entropic gravity will automatically call gravity an emergent phenomenon. They must conclude that there is a new, fundamental theory of gravity to be found, and this theory will "restore" the probabilistic interpretation of entropy.

The reason entropic gravity is exciting and exotic is that many other searches for this fundamental theory start with a (more or less) direct quantization of gravity, much like one can quantize classical mechanics to arrive at quantum mechanics. Entropic gravity posits that this is the wrong approach, in the same way that one does not try to directly quantize the ideal gas law.

[0] Let me stress this: there is no entropy without probability distributions, even in physics. Anyone claiming otherwise is stuck in the nineteenth century, perhaps because they learned only thermodynamics but not statistical mechanics.

  • meindnoch 9 hours ago

    Sure, I'm not denying that entropy exists as a concept that can be used to explain things macroscopically. But like you said, its origins are statistical. To me, temperature is also a similar "made up" concept. We can only talk about temperature because a sufficiently large group of particles will converge to a single-parameter distribution with their velocities. A single particle in isolation doesn't have a temperature.

    So if they say gravity might be an entropic effect, does that mean they assume there's something more fundamental "underneath" spacetime that - in the statistical limit - produces the emergent phenomenon of gravity? So it isn't the entropy of matter that they talk about, but the entropy of something else, like the grains of spacetime or whatever.

    • flufluflufluffy 7 hours ago

      Yes, exactly. The model is based on (in the first approach) a “lattice” of some type of undiscovered particle-like thing (what they refer to as “qubits” in the article, which is unfortunate because it is NOT the same “qubit” from quantum computing) permeating spacetime. Or maybe more aptly, it is that lattice from which spacetime emerges. And what we observe as the force of gravity emerges from the entropic forces happening in this lattice.

    • spacecadet 8 hours ago

      I'm an idiot, let's get that out of the way first. I think that your temperature analogy answered your own question.

      I guess my question in turn is: if we imagine a universe at the end of time(?), one that is maybe dominated by a few black holes and not much else, would an observer experience gravity if placed sufficiently far away? Or even further, if nothing is left in the universe at all? Assuming that doesn't cause a big crunch, rip, or whatever...

  • simiones 8 hours ago

    > You are absolutely right that entropy is always fundamentally a way to describe our lack of perfect knowledge of the system [0].

    > [0] Let me stress this: there is no entropy without probability distributions, even in physics.

    The second item doesn't entail the first. Probabilities can be seen as a measure of lack of knowledge about a system, but it isn't necessarily so. A phenomenon can also be inherently/fundamentally probabilistic. For example, wave function collapse is, to the best of our knowledge, an inherently non-deterministic process. This is very relevant to questions about the nature of entropy - especially since we have yet to determine if it's even possible for a large system to be in a non-collapsed state.

    If it turns out that there is some fundamental process that causes wave function collapse even in perfectly isolated quantum systems, then it would be quite likely that entropy is related to such a process, and that it may be more than a measure of our lack of knowledge about the internal state of a system, and instead a measurement of the objective "definiteness" of that state.

    I am aware that objective collapse theories are both unpopular and have some significant hurdles to overcome - but I also think that from a practical perspective, the gap between the largest systems we have been able to observe in pure states versus the smallest systems we could consider measurement devices is still gigantic and leaves us quite a lot of room for speculation.

mjburgess 11 hours ago

Even if we take that view, gravity is still basically a similar case. What we call "gravity" is really an apparent force that isn't a force at all when seen from a full 4D point of view.

Imagine sitting outside the whole universe from t=0 to t=end and observing it as one whole block. Then the trajectories of matter, unaffected by any force at all, are those we call gravitational.

From this pov, it makes a lot more sense to connect gravity with some orderly or disorderly features of these trajectories.

Inertia, on this view, is just a kind of hysteresis the matter distribution of the universe has - i.e., a kind of remembered deformation that persists as the universe evolves.

  • tsimionescu 10 hours ago

    > From this pov, it makes a lot more sense to connect gravity with some orderly or disorderly features of these trajectories.

    On the contrary, entropic gravity works pretty well for the Newtonian view of gravity as a force, and not the GR view of gravity as a deformation of space time and analogous to acceleration. Acceleration is a very elementary concept, one you find even in microscopic descriptions. Gravity being essentially the same thing makes it far more elementary than a concept like entropy, which only applies to large groups of particles.

    So, if the GR picture is the right one, if gravity and acceleration are essentially the same thing, it's very hard to see how that aligns with gravity being an emergent phenomenon that only happens at large scales. However, if gravity is just a tendency for massive objects to come together, as in the Newtonian picture, that is perfectly easy to imagine as an entropic effect.

sixo 4 hours ago

This comment thread is exhibit N-thousand that "nobody really understands entropy". My basic understanding goes like this:

In thermodynamics, you describe a system with a massive number of microstates/dynamical variables according to 2-3 measurable macrostate variables. (E.g. `N, V, E` for an ideal gas.)

If you work out the dynamics of those macrostate variables, you will find that (to first order, i.e. in the thermodynamic limit) they depend only on the form of the entropy function of the system `S(E, N, V)`, e.g. Maxwell relations.
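 
(The monatomic ideal gas is the standard worked case: its Sackur-Tetrode entropy is

    S(E, N, V) = N k_B [ ln( (V/N) (4 pi m E / (3 N h^2))^(3/2) ) + 5/2 ]

and differentiating this single function recovers the macroscopic laws:

    1/T = dS/dE = 3 N k_B / (2 E)   =>   E = (3/2) N k_B T
    p/T = dS/dV = N k_B / V         =>   p V = N k_B T

i.e. everything thermodynamic about the gas is encoded in the form of S.)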

If you measured a few more macrostate variables, e.g. the variance in energy `sigma^2(E)` and the center of mass `m`, or anything else, you would be able to write new dynamical relations that depend on a new "entropy" `S(E, N, V, sigma^2(E), m)`. You could add 1000 more variables, or a million - e.g. every pixel of an image - basically up until the point where the thermodynamic limit assumptions cease to hold.

The `S` function you'd get will capture the contribution of every-variable-you're-marginalizing-over to the relationships between the remaining variables. This is the sense in which it represents "imperfect knowledge". Entropy dependence arises mathematically in the relationships between macrostate variables - they can only couple to each other by way of this function, which summarizes all the variables you don't know/aren't measuring/aren't specifying.

That this works is rather surprising! It depends on some assumptions which I cannot remember (on convexity and factorizability and things like that), but which apply to most or maybe all equilibrium thermodynamic-scale systems.

For the ideal gas, say, the classical-mechanics, classical-probability, and quantum-mechanical descriptions of the system all reduce to the same `S(N, V, E)` function under this enormous marginalization - the most "zoomed-out" view of their underlying manifold structures turns out to be identical, which is why they all describe the same thing. (It is surprising that seemingly obvious things like the size of the particles would not matter. It turns out that the asymptotic dynamics depend only on the information theory of the available "slots" that energy can go into.)

All of this appears as an artifact of the limiting procedure in the thermodynamic limit, but it may be the case that it's more "real" than this—some hard-to-characterize quantum decoherence may lead to this being not only true in an extraordinarily sharp first-order limit, but actually physically true. I haven't kept up with the field.

No idea how to apply this to gravity though.

IsTom 8 hours ago

If you want to only have one possible past (i.e. you can't destroy information), then when you end up in one branch of a quantum state you need to "store" enough information to separate you from other branches, and you really do need multiple possible microstates to differentiate them. If you look post factum, obviously you did end up in a specific state, but statistics do their work otherwise.

logicchains 11 hours ago

Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions. Quantum mechanics assumes, as the name implies, that reality at the smallest level can be quantised, so it's completely appropriate to apply entropy to describing things at the microscopic scale.

  • aurareturn 8 hours ago

    If we knew the exact state of all particles in an enclosed system, we could calculate future states exactly. No need to consider possible states.

    • IAmBroom 7 hours ago

      Since that's not possible in any physical system of one or more particles, it's irrelevant.

  • kgwgk 9 hours ago

    > Entropy isn't a function of imperfect knowledge. It's a function of the possible states of a system and their probability distributions.

    There are no probability distributions over possible states when there is perfect knowledge of the state.

    > Quantum mechanics

    Entropy is also zero for a pure quantum state. You won’t have entropy without imperfect knowledge.

    • whereismyacc 8 hours ago

      > There are no probability distributions over possible states when there is perfect knowledge of the state.

      I know very little about physics but I thought that the leading interpretations of quantum physics say that the probability distribution is all we can know about a system. The entropy is not due to a lack of information about the quantum state, but because the outcomes are inherently stochastic?

      • kgwgk 8 hours ago

        Entropy is about the state - not about “outcomes”.

        “All we can know” is the precise state - at least in principle - and entropy is zero in that case.

    • mr_mitm 8 hours ago

      Just look at the definition of entropy. Knowledge about a system never enters the equation.

      S := -k_B sum p_i ln (p_i)

      • ajkjk 5 hours ago

        As the other replier said, despite your dismissiveness, the knowledge about the system is in the probabilities, so it's right there in the equation.

        Suppose you flip a coin. Before flipping the coin, your knowledge is "heads or tails". After flipping it, your knowledge becomes one of either heads or tails. The amount of information you gained by resolving your imperfect knowledge is the entropy of the distribution.

        The same model works for physical entropy without much modification; the imperfect knowledge is the difference between knowing a macrostate versus the exact microstate.
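 
        A few lines of Python (purely illustrative) show both regimes: a fully known state has zero entropy, and a macrostate that is uniform over W microstates collapses to Boltzmann's S = k ln W.

            import math
            from math import comb

            def entropy(probs, k=1.0):
                # S = -k * sum(p_i * ln p_i); p = 0 terms contribute nothing
                return -k * sum(p * math.log(p) for p in probs if p > 0)

            print(entropy([1.0]))        # zero: state known exactly (prints -0.0)
            print(entropy([0.5, 0.5]))   # fair coin before the flip: ln 2 ~ 0.693

            # Macrostate "exactly 50 heads in 100 flips", uniform over W
            # microstates: the sum collapses to S = k * ln(W).
            W = comb(100, 50)
            print(math.log(W))           # ~ 66.8 nats of missing microstate info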

        • antonvs an hour ago

          You’re glossing over an important point: your knowledge of the future state of the system is “heads or tails”.

          One of the things entropy tells us is how a system is likely to evolve in the future. But looking at this another way, entropy actually helps dictate how it will evolve. And we can prove that mathematically.

      • kgwgk 8 hours ago

        p_i

        Edit to add lots of words:

        In the definition of entropy

        S := -k_B sum p_i ln (p_i)

        knowledge about the system enters the equation in the p_i terms.

        The other term is a constant so it’s not like there are many other choices to link the entropy to the system!

whereismyacc 9 hours ago

It sounds like you're talking about information entropy, which to my understanding is analogous to, but not the same as, entropy in physics?

  • ajkjk 5 hours ago

    It pretty much is the same, except that entropy in physics usually has a constant in front of it.
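 
    (Concretely, for the same distribution they differ only by units: S = (k_B ln 2) * H_bits, so one bit of missing microstate information corresponds to k_B ln 2 ~ 9.6e-24 J/K.)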

mjanx123 9 hours ago

Entropy is the opposite of potential

  • echelon 5 hours ago

    Entropy is complicated beyond just a Rankine or Carnot cycle.

    Biology thrives at the ebbs, flows, and eddies of entropy. Predation. Biochemical flux. There are arrows flowing every which way, and systems that keep it finely tuned.

    This theory, based on my surface-level reading and understanding, is that the aggregate particle-level entropy within sub-light-speed systems creates gravity.

bmitc 8 hours ago

> It's a made-up thing by humans.

All of physics is made up by humans.
