Equating the thermodynamic ΔS of (3.56) to the statistical-mechanical ΔS of (3.55), we get n_d R = N_d k and k = R n_d/N_d = R/N_A, where N_A ≡ N_d/n_d [Eq. (1.5)] is the Avogadro constant. Thus

    k = R/N_A = (8.314 J mol⁻¹ K⁻¹)/(6.022 × 10²³ mol⁻¹) = 1.38 × 10⁻²³ J/K          (3.57)
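The arithmetic in (3.57) is easy to check; the short Python sketch below simply divides the gas constant by the Avogadro constant, using the values quoted above:

```python
# Arithmetic check of Eq. (3.57): Boltzmann's constant is the gas constant
# per molecule, k = R/N_A.
R = 8.314        # gas constant, J mol^-1 K^-1
N_A = 6.022e23   # Avogadro constant, mol^-1

k = R / N_A      # Boltzmann's constant, J/K
print(f"k = {k:.3e} J/K")   # prints k = 1.381e-23 J/K
```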
We have evaluated k in the statistical-mechanical formula S = k ln p + a. The fun-
                                         damental physical constant k, called Boltzmann’s constant, plays a key role in statis-
                                         tical mechanics. The connection between entropy and probability was first recognized
in the 1870s by the physicist Ludwig Boltzmann. The application of S = k ln p + a
                                         to situations more complicated than the mixing of perfect gases requires knowledge of
                                         quantum and statistical mechanics. In Chapter 21 we shall obtain an equation that
                                         expresses the entropy of a system in terms of its quantum-mechanical energy levels.
                                         Our main conclusion for now is that entropy is a measure of the probability of a state.
                                         Apart from an additive constant, the entropy is proportional to the log of the proba-
                                         bility of the thermodynamic state.
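As an illustrative sketch (the amounts are assumed here, not taken from the text), the relation ΔS = k ln(p₂/p₁) implied by S = k ln p + a can be applied to the mixing of 1 mol each of two perfect gases initially confined to equal, separate volumes. Each molecule is equally likely to end up in either half of the final container, so the mixed state is more probable than the unmixed one by a factor of 2 raised to the total number of molecules, and the familiar result ΔS = 2nR ln 2 follows:

```python
import math

# Sketch: Delta S from the probability ratio of two thermodynamic states,
# Delta S = k*ln(p2/p1), for mixing 1 mol each of two perfect gases that
# start in equal, separate volumes (amounts assumed for illustration).
R = 8.314                  # gas constant, J mol^-1 K^-1
N_A = 6.022e23             # Avogadro constant, mol^-1
k = R / N_A                # Boltzmann's constant, J/K
n = 1.0                    # moles of each gas

# The mixed state is more probable than the unmixed one by 2**(2*n*N_A),
# since each of the 2*n*N_A molecules can occupy either half of the container.
ln_prob_ratio = 2 * n * N_A * math.log(2)
delta_S = k * ln_prob_ratio
print(f"Delta S of mixing = {delta_S:.2f} J/K")   # about 11.5 J/K = 2nR ln 2
```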
Equation (3.52) reads S = (R/N_A) ln p + a. This relation is valid for any system,
                                         not just an ideal gas. The occurrence of R in this general equation shows that the con-
                                         stant R is more universal and fundamental than one might suspect from its initial oc-
                                         currence in the ideal-gas law. (The same is true of the ideal-gas absolute temperature
                                         T.) We shall see in Chapter 21 that R/N , the gas constant per molecule (Boltzmann’s
                                                                          A
                                         constant), occurs in the fundamental equations governing the distribution of molecules
                                         among energy levels and thermodynamic systems among quantum states.
                                             Disordered states generally have higher probabilities than ordered states. For exam-
                                         ple, in the mixing of two gases, the disordered, mixed state is far more probable than the
                                         ordered, unmixed state. Hence it is often said that entropy is a measure of the molecular
                                         disorder of a state. Increasing entropy means increasing molecular disorder. However,
                                         order and disorder are subjective concepts, whereas probability is a precise quantitative
                                         concept. It is therefore preferable to relate S to probability rather than to disorder.
                                             For mixing two different gases, the connection between probability and entropy is
                                         clear. Let us examine some other processes. If two parts of a system are at different
                                         temperatures, heat flows spontaneously and irreversibly between the parts, accompa-
                                         nied by an increase in entropy. How is probability involved here? The heat flow oc-
                                         curs via collisions between molecules of the hot part with molecules of the cold part.
                                         In such collisions, it is more probable for the high-energy molecules of the hot part to
                                         lose some of their energy to the low-energy molecules of the cold part than for the re-
                                         verse to happen. Thus, internal energy is transferred from the hot body to the cold until
                                         thermal equilibrium is attained, at which point it is equally probable for molecular col-
                                         lisions to transfer energy from one part to the second part as to do the opposite. It is
                                         therefore more probable for the internal molecular translational, vibrational, and rota-
                                         tional energies to be spread out among the parts of the system than for there to be an
                                         excess of such energy in one part.
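The drift of energy from hot to cold can be imitated with a crude toy model (an illustration only, not a model used in the text): energy quanta are shared among oscillators in a "hot" part and a "cold" part, and at each step one quantum, chosen at random from all the quanta in the system, hops to a randomly chosen oscillator. Because the hot part holds more quanta, it is the more probable donor, and on average the energy spreads out until both parts hold the same amount:

```python
import random

# Toy model: quanta hop between oscillators; donors are picked with
# probability proportional to their energy, acceptors uniformly at random.
random.seed(0)
N = 100                          # oscillators per part (assumed)
system = [10] * N + [2] * N      # hot part: 10 quanta each; cold part: 2 each

for step in range(20000):
    donor = random.choices(range(2 * N), weights=system)[0]
    acceptor = random.randrange(2 * N)
    system[donor] -= 1
    system[acceptor] += 1

avg_hot = sum(system[:N]) / N
avg_cold = sum(system[N:]) / N
print(f"average quanta per oscillator: hot {avg_hot:.1f}, cold {avg_cold:.1f}")
# Both averages approach (10 + 2)/2 = 6 as the energy spreads out.
```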
Now consider an isolated reaction mixture of H₂, Br₂, and HBr gases. During
                                         molecular collisions, energy transfers can occur that break bonds and allow the for-
                                         mation of new chemical species. There will be a probability for each possible outcome
                                         of each possible kind of collision, and these probabilities, together with the numbers
                                         of molecules of each species present, determine whether there is a net reaction to give
more HBr or more H₂ and Br₂. When equilibrium is reached, the system has attained
                                         the most probable distribution of the species present over the available energy levels
of H₂, Br₂, and HBr.
                                             These last two examples indicate that entropy is related to the distribution or
                                         spread of energy among the available molecular energy levels. The total energy of an