why don't planets lose their atmospheres to outer space? one would assume that the Boltzmann distribution would always allow a certain fraction of molecules to have kinetic energies large enough to overcome the gravitational potential.

bonus question: wouldn't the same thing apply to other gravitationally bound equilibrated systems such as open clusters, globular clusters and galaxies?

planets do in fact lose their atmospheres slowly to outer space due to the high-energy tail of the Boltzmann distribution, but in the case of the Earth, slowly means *very* slowly.

the Boltzmann factor of molecules in the Earth's atmosphere is roughly $\exp(-(11.2\,\mathrm{km/s} \,/\, 330\,\mathrm{m/s})^2) = \exp(-1150) \simeq 10^{-500}$, with 11.2 km/s the escape velocity and 330 m/s a typical thermal speed of air molecules. each second (which is roughly the time a molecule needs to cover the scale height of the atmosphere at the first cosmic velocity) the Earth loses a fraction $10^{-500}$ of its atmosphere. after 5 billion years, i.e. about $10^{17}$ seconds, the Earth would have lost a fraction of $10^{-483}$: almost nothing. this calculation neglects that the mean free path of air molecules is much smaller than the scale height, so the molecules do not just rise unimpeded in the gravitational potential, which makes the leakage even smaller.
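as a sanity check, the arithmetic above can be reproduced in a few lines (the inputs, 330 m/s thermal speed and one second per scale-height crossing, are the rough values used above, not precise ones):

```python
import math

# rough inputs from the estimate above (illustrative, not precise values)
v_thermal = 330.0        # m/s, typical thermal speed of air molecules
v_escape = 11.2e3        # m/s, Earth's escape velocity
t_cross = 1.0            # s, time to cross one scale height
age = 5e9 * 3.156e7      # s, roughly 5 billion years

# exponent of the Boltzmann factor, ~1150
exponent = (v_escape / v_thermal) ** 2
# the factor itself underflows ordinary floats, so work in log10
log10_factor = -exponent / math.log(10)
log10_lost = log10_factor + math.log10(age / t_cross)
print(round(log10_factor), round(log10_lost))  # prints: -500 -483
```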

the moon is a very different case: its escape velocity is smaller by a factor of $\sqrt{6\times4}$ (the surface gravity is 6 times weaker and the radius about 4 times smaller), so the exponent in the Boltzmann factor is 24 times smaller, and the scale height is 6 times larger. under the same approximations the moon loses a fraction of $10^{-3}$ of its atmosphere in 5 billion years. this is likely a lower limit: at a slightly higher temperature the atmosphere is lost rapidly, and the solar wind would likewise strip a less tightly bound atmosphere.
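plugging the same rough factors into the sketch above (escape velocity smaller by $\sqrt{24}$, scale height 6 times larger) lands around $10^{-4}$, the same order of magnitude as the estimate quoted above, given how crude the inputs are:

```python
import math

# moon: escape velocity smaller by sqrt(6*4), scale height 6x larger
# (rough factors from the estimate above, not precise lunar values)
v_thermal = 330.0                    # m/s
v_escape = 11.2e3 / math.sqrt(24)    # ~2.3 km/s
t_cross = 6.0                        # s, 6x larger scale height
age = 5e9 * 3.156e7                  # s

exponent = (v_escape / v_thermal) ** 2   # ~48, i.e. 24x smaller than Earth's
log10_lost = -exponent / math.log(10) + math.log10(age / t_cross)
print(round(log10_lost, 1))              # prints: -4.4
```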

(this expert answer is due to Uli Bastian, whom we would like to thank a lot!)

the same does in fact apply to gravitationally bound systems, and it is perhaps even more interesting there due to the negative specific heat of those objects. the energies of, e.g., stars in globular clusters are constantly reshuffled by three-body interactions, and a single star can acquire just enough energy to leave the cluster, most likely on a (nearly) parabolic orbit. the cluster is then more tightly bound, because the binding energy is shared among fewer stars, so they have to move more rapidly, leading to an even stronger reshuffling of energy. in essence, a cluster heats up by evaporative cooling. :) CQW recommends the book *Galactic Dynamics* by Binney and Tremaine for this very counterintuitive idea.
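this heating-by-evaporation can be made concrete with a toy bookkeeping exercise (all numbers and unit masses below are arbitrary illustrative choices): a virialized cluster has $E = -K$, and a star escaping on a parabolic orbit carries away zero total energy, so the remaining, smaller cluster must re-virialize at a higher velocity dispersion:

```python
# toy bookkeeping: a virialized cluster (E = K + U = -K by the virial
# theorem) loses one star on a parabolic, i.e. zero-energy, orbit.
# all numbers are arbitrary illustrative units.
N = 1000                     # number of stars
m = 1.0                      # mass per star
sigma2 = 1.0                 # initial mean-square velocity
K = 0.5 * N * m * sigma2     # total kinetic energy
E = -K                       # virial theorem: E = -K

N_after = N - 1              # one star escapes...
E_after = E                  # ...carrying zero total energy with it
K_after = -E_after           # the cluster re-virializes: E = -K again
sigma2_after = 2.0 * K_after / (N_after * m)

print(sigma2_after > sigma2)  # prints: True -- the cluster heated up
```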

we would like to add that you can even switch from negative to positive specific heat. what is needed for the negative specific heat is a slow removal of energy, on a time scale much longer than the dynamical time scale of the cluster, so that the particles have enough time to rearrange themselves in phase space and to stay equilibrated; only then does the "weird" minus sign in the virial theorem come into play. if one removed energy really fast, faster than the dynamical time scale, the system would behave normally and show a positive specific heat: removal of energy would cause it to cool down.
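the "weird" minus sign can be written out explicitly: with $K \propto T$ for an equilibrated, slowly evolving system, the virial theorem forces the specific heat to come out negative:

```latex
% virial theorem for a self-gravitating system in equilibrium:
%   2K + U = 0  =>  E = K + U = -K
% with K proportional to T, the specific heat is negative:
\begin{align*}
  2K + U &= 0, \\
  E &= K + U = -K, \\
  C &= \frac{\mathrm{d}E}{\mathrm{d}T}
     = -\frac{\mathrm{d}K}{\mathrm{d}T} < 0 .
\end{align*}
```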