Classical dynamics are deterministic, but we use statistics to describe the physical world because sensitive dependence prevents us from measuring a system precisely enough for accurate long-term modeling. Even if we could, the modeling would be computationally intractable. For these two reasons statistical measures like entropy are valid, but descriptions of their evolution should not be called laws, because they present an inaccurate depiction of the effective causality we observe.
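A minimal sketch of the sensitive-dependence point as stated above; the logistic map and the 10^-12 offset are arbitrary stand-ins for a real molecular system and a realistic measurement error, nothing more:

```python
# Toy demo of sensitive dependence: two logistic-map trajectories that start
# 1e-12 apart are macroscopically different after a few dozen iterations.
r = 4.0                       # fully chaotic parameter value
x, y = 0.2, 0.2 + 1e-12       # two almost-identical initial conditions
for n in range(60):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if n % 10 == 9:
        print(f"step {n+1:2d}: x={x:.6f}  y={y:.6f}  |x-y|={abs(x-y):.2e}")
```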
@Ethan: Bah! We use statistics to describe systems because we don’t care about the detailed molecular motions in our internal combustion engines. We just care that they go and we care to be able to predict the rate of going — which is sufficiently handled by StatMech.
Your second point is muddled. The reason not to take the usual story about the evolution of entropy at face value is that all of the relevant physical processes driving that evolution are time symmetric: entropy should increase in both time directions. The mysteries are:
(1) What is the cause of the apparent entropy/time gradient that we observe in the present epoch?
(2) How did the entropy come to be (apparently) so low in the past? And we mean ooms (orders of magnitude) of ooms low: typical estimates put it 10^10^2 times lower than it could have been.
(3) Given the time-reversal symmetry, why aren't collapsing singularities (black holes) producing objects with unbelievably low entropy, by symmetry with the expanding singularity (the Big Bang)?
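To see the "both time directions" point concretely, here is a toy sketch: non-interacting particles in a box with reflecting walls, all started in the left half with random velocities, and a coarse-grained entropy computed over spatial bins at negative and positive times. The particle count, bin count, and velocity scale are arbitrary assumptions chosen only to make the symmetry visible:

```python
# Toy demo: time-symmetric dynamics give entropy growth toward BOTH past and
# future from a spatially localized state. Free particles in the box [0, 1]
# with reflecting walls; coarse-grained entropy over spatial bins.
import numpy as np

rng = np.random.default_rng(0)
N, bins = 100_000, 20
x0 = rng.uniform(0.0, 0.5, N)   # low-entropy macrostate: left half only
v = rng.normal(0.0, 1.0, N)     # random (thermal-ish) velocities

def coarse_entropy(t):
    # free streaming plus reflecting walls, via triangle-wave folding
    y = np.mod(x0 + v * t, 2.0)
    x = np.where(y <= 1.0, y, 2.0 - y)
    p, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    p = p[p > 0] / N
    return float(-np.sum(p * np.log(p)))

for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    print(f"t = {t:+.1f}   S_coarse = {coarse_entropy(t):.3f}   (max = {np.log(bins):.3f})")
```

The t = 0 value sits near ln 10, while the values at t = -2 and t = +2 both sit near the maximum ln 20: the localized state is the entropy minimum in both directions.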
Further, sensitive dependence is not the problem. Noncommuting operators guarantee that we cannot predict a molecular system for even a short time, because we can only capture about half of the initial conditions. Then the statistical nature of the interactions guarantees that we can't meaningfully predict anything other than statistical properties for longer than about one free period (~10^-10 seconds for a cold, rarefied gas like the atmosphere). We use statistics because the underlying process is not deterministic.
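A back-of-envelope check of that ~10^-10 second figure, reading "free period" as the mean free time between collisions and assuming nitrogen near room temperature and one atmosphere; the kinetic diameter and molecular mass below are standard reference values, not numbers taken from the comment:

```python
# Rough kinetic-theory estimate of the mean free time for N2 at ~300 K, 1 atm.
import math

k_B = 1.380649e-23   # J/K, Boltzmann constant
T   = 300.0          # K, assumed temperature
P   = 1.013e5        # Pa, assumed pressure
d   = 3.7e-10        # m, assumed kinetic diameter of N2
m   = 4.65e-26       # kg, mass of one N2 molecule

n      = P / (k_B * T)                           # number density, ~2.4e25 m^-3
sigma  = math.pi * d ** 2                        # collision cross-section
lam    = 1.0 / (math.sqrt(2) * n * sigma)        # mean free path, ~7e-8 m
v_mean = math.sqrt(8 * k_B * T / (math.pi * m))  # mean speed, ~475 m/s
tau    = lam / v_mean                            # mean free time

print(f"mean free time ~ {tau:.1e} s")           # ~1.4e-10 s, consistent with the figure above
```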
Computational intractability is not relevant. We can infallibly predict the inexorable bleeding of wavefunctions into larger volumes of the Hilbert configuration space for any ensemble of particles, no matter how large, how imprecisely initially specified, or how indeterministic the interaction process. The second law, then, comes down to the observation you wish to take for granted: the fact that entropy can be observed to increase implies that the Hilbert space isn't already uniformly filled, which means that some prior state of the universe occupied only a tiny portion of the configuration space. Your position is caught on the horns of a dilemma (a toy numerical sketch follows the two horns below):
* If you accept observable effective causality, then you grant that it is possible to observe something change as time passes, meaning that the configuration space isn't already uniformly filled, and meaning that there is still some way for the entropy to increase, which it inexorably will under the dynamics we have identified.
* If you deny that the evolution operators will eventually fill the configuration space, then you assert that only some portion of that space is reachable — so there will be a time when the uniform fog of wavefunction on the reachable configuration space admits no variation with respect to which an evolution could be identified — and no effective causality can subsequently be observed.
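Here is the toy numerical sketch promised above: a random Hamiltonian on a small bipartite Hilbert space stands in, very loosely, for real dynamics. A single product basis state spreads, the entanglement entropy of the 4-dimensional factor climbs from zero, and then it saturates a bit below its maximum ln 4, after which this coarse quantity shows no further variation. The dimensions, the random-matrix Hamiltonian, and the sample times are all arbitrary assumptions:

```python
# Toy demo: a state vector "bleeding" through a larger Hilbert space and then
# saturating. Random Hermitian H on a 4 x 16 bipartite space; the entanglement
# entropy of the 4-dimensional factor grows from zero and levels off a bit
# below its maximum ln(4).
import numpy as np

rng = np.random.default_rng(1)
dA, dB = 4, 16
D = dA * dB

M = rng.normal(size=(D, D)) + 1j * rng.normal(size=(D, D))
H = (M + M.conj().T) / 2.0                 # random Hermitian Hamiltonian
evals, evecs = np.linalg.eigh(H)

psi0 = np.zeros(D, dtype=complex)
psi0[0] = 1.0                              # a single product basis state: zero entanglement
c0 = evecs.conj().T @ psi0                 # expansion in the energy eigenbasis

def entanglement_entropy(t):
    psi = evecs @ (np.exp(-1j * evals * t) * c0)
    m = psi.reshape(dA, dB)
    rhoA = m @ m.conj().T                  # reduced density matrix of the small factor
    lam = np.linalg.eigvalsh(rhoA)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log(lam)))

for t in [0.0, 0.05, 0.1, 0.2, 0.5, 1.0, 3.0, 10.0]:
    print(f"t = {t:5.2f}   S_A = {entanglement_entropy(t):.3f}   (ln 4 = {np.log(4):.3f})")
```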
SWNTHT L IEO ‘YPETRA
(beat that for entropy!)
I want this on a shirt.
Claude Shannon would be proud.
To better understand this, I recommend
http://math.ucr.edu/home/baez/week26.html
http://math.ucr.edu/home/baez/week80.html
and all the references therein, especially
http://xxx.lanl.gov/abs/gr-qc/9310022