Geochemistry - William M. White
2.6.2 Statistical mechanics: a microscopic perspective of entropy
Whereas energy is a property for which we gain an intuitive feel through everyday experience, the concept of entropy is usually more difficult to grasp. Perhaps the best intuitive understanding of entropy can be obtained from the microscopic viewpoint of statistical mechanics. For that reason, we will make the first of several brief excursions into the world of atoms, molecules, and quanta.
Let's return to our box of gas and consider what happens on a microscopic scale when we remove the partition. To make things tractable, we'll consider that each gas consists of only two molecules, so there are four all together, two red and two black. For this thought experiment, we will keep track of the individual molecules so we label them 1red, 2red, 1black, 2black. Before we removed the partition, the red molecules were on one side and the black ones on the other. Our molecules have some thermal energy, so they are free to move around. So by removing the partition, we are essentially saying that each molecule is equally likely to be found in either side of the box.
Before we removed the partition, there was only one possible arrangement of the system; this is shown in Figure 2.5a. Once we remove the partition, we have four molecules and two subvolumes, and a total of 2^4 = 16 possible configurations (Figure 2.5b) of the system. The basic postulate of statistical mechanics is: a system is equally likely to be found in any of the states accessible to it. Thus, we postulate that each of these configurations is equally likely. Only one of these states corresponds to the original one (all red molecules on the left). Thus, the probability of the system being found in its original state is 1/16. That is not particularly improbable. However, suppose that we had altogether a mole of gas (≈6 × 10^23 molecules). The probability of the system ever being found again in its original state is then about 1 in 2^(6 × 10^23), which is unlikely indeed.
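The counting above is easy to verify directly. The following short Python sketch (not part of the original text) enumerates every way of assigning the four labeled molecules to the two halves of the box:

```python
# Enumerate the 2^4 = 16 equally likely configurations of the four
# labeled molecules, each assigned to the left (L) or right (R) half.
from itertools import product

molecules = ["1red", "2red", "1black", "2black"]
configs = list(product("LR", repeat=len(molecules)))
print(len(configs))  # 16

# Only one configuration matches the original state: both red molecules
# on the left and both black molecules on the right.
original = [c for c in configs
            if c[:2] == ("L", "L") and c[2:] == ("R", "R")]
print(len(original), "/", len(configs))  # 1 / 16
```

Each configuration is equally likely by the basic postulate, so the probability of any one of them, including the original state, is simply 1 divided by the total count.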
Now consider a second example. Suppose that we have two copper blocks of identical mass at different temperatures and separated by a thermally insulating barrier (Figure 2.6). Imagine that our system, which is the two copper blocks, is isolated in space so that the total energy of the system remains constant. What happens if we remove the insulating barrier? Experience tells us that the two copper blocks will eventually come into thermal equilibrium, and their temperatures will eventually be identical.
Figure 2.5 Possible distributions of molecules of a red and a black gas in a box (a) before and (b) after removal of the partition separating them.
Now let's look at this process on a microscopic scale. We have already mentioned that temperature is related to internal energy. As we shall see, this relationship will differ depending on the nature and mass of the material of interest, but since our blocks are of identical size and composition, we can assume that temperature and energy are directly related in this case. Suppose that before we remove the insulation, the left block has one unit of energy and the right one has five units (we can think of these as quanta, but this is not necessary). The question is, how will energy be distributed after we remove the insulation?
Figure 2.6 Two copper blocks at different temperatures separated by an insulator. When the insulator is removed and the blocks brought in contact, the blocks come to thermal equilibrium. Entropy increases in this process.
In the statistical mechanical viewpoint, we cannot determine how the energy will be distributed; we can only compute the possible ways it could be distributed. Each of these energy distributions is then equally likely according to the basic postulate. So let's examine how it can be distributed. Since we assume that the distribution is completely random, we proceed by randomly assigning the first unit to either the left or right block, then the second unit to either, and so on. With six units of energy, there are already more ways of distributing it (2^6 = 64) than we have space to enumerate here. For example, there are six ways energy can be distributed so that the left block has one unit and the right one has five units. This is illustrated in Figure 2.7. However, since we can't actually distinguish the energy units, all these ways are effectively identical. There are 15 ways, or combinations, to distribute the energy so that the left block has two units and the right has four units. Similarly, there are 15 combinations where the left block has four units and the right has two units. For this particular example, the rule is that if there are a total of E units of energy, e of which are assigned to the left block and (E − e) to the right, then there will be Ω(e) identical combinations where Ω(e) is calculated‡ as:
Figure 2.7 There are six possible ways to distribute six energy units so that the left block has one unit and the right block has five units.
Ω(e) = E! / [e!(E − e)!]        (2.36)†
Here we use Ω(e) to denote the function that describes the number of states accessible to the system for a given value of e. In this particular example, “states accessible to the system” refers to a given distribution of energy units between the two blocks. According to eqn. 2.36 there are 20 ways of distributing our six units of energy so that each block has three. There is, of course, only one way to distribute energy so that the left block has all of the energy and only one combination where the right block has all of it.
According to the basic postulate, any of the 64 possible distributions of energy are equally likely. The key observation, however, is that there are many ways to distribute energy for some values of e and only a few for other values. Thus the chances of the system being found in a state where each block has three units is 20/64 = 0.3125, whereas the chances of the system being in the state with the original distribution (one unit to the left, five to the right) are only 6/64 = 0.0938. So it is much more likely that we will find the system in a state where energy is equally divided than in the original state.
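These counts and probabilities can be checked in a few lines of Python using the standard-library binomial coefficient; the energies are those of the example in the text:

```python
# Evaluate eqn. 2.36, Omega(e) = E!/(e!(E - e)!), for E = 6 energy units.
from math import comb

E = 6
omegas = [comb(E, e) for e in range(E + 1)]
print(omegas)       # [1, 6, 15, 20, 15, 6, 1]
print(sum(omegas))  # 64 = 2^6 total distributions

# Probabilities of the equal split and of the original 1/5 split:
print(comb(E, 3) / 2**E)  # 0.3125
print(comb(E, 1) / 2**E)  # 0.09375
```

The list of Ω(e) values reproduces the counts quoted in the text: one way each for the extreme distributions, six for the 1/5 splits, 15 for the 2/4 splits, and 20 for the equal split.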
Of course, two macroscopic blocks of copper at any reasonable temperature will have far more than 6 quanta of energy. Let's take a just slightly more realistic example and suppose that they have a total of 20 quanta and compute the distribution. There will be 2^20 possible distributions, far too many to consider individually, so let's do it the easy way and use eqn. 2.36 to produce a graph of the probability distribution. Equation 2.36 gives the number of identical states of the system for a given value of e. The other thing that we need to know is that the chances of any one of these states occurring is simply (1/2)^20. So to compute the probability of a particular distinguishable distribution of energy occurring, we multiply this probability by Ω. More generally, the probability, P, will be:
P = (E! / [e!(E − e)!]) p^e q^(E−e)        (2.37)
where p is the probability of an energy unit being in the left block and q is the probability of it being in the right. This equation is known as the binomial distribution.§ Since both p and q are equal to 0.5 in our case (if the blocks were of different mass or of different composition, p and q would not be equal), the product p^e q^(E−e) is just p^E and eqn. 2.37 simplifies to:
P = (E! / [e!(E − e)!]) p^E = Ω(e) p^E        (2.38)
Since p^E is a constant (for a given value of E and configuration of the system), the probability of the left block having e units of energy is directly proportional to Ω(e). It turns out that this is a general relationship, so that for any system we may write:
P = C Ω(ƒ)        (2.39)
where ƒ is some property describing the system and C is some constant (in this case 0.5^20). Figure 2.8a shows the probability of the left block having e units of energy. Clearly, the most likely situation is that both will have approximately equal energy. The chance of one block having 1 unit and the other 19 units is very small (2 × 10^−5 to be exact). In reality, of course, the number of quanta of energy available to the two copper blocks will be of the order of multiples of the Avogadro number. If one or the other block has 10 or 20 more units, or even 10^10 more quanta, than the other, we wouldn't be able to detect it. Thus, energy will always appear to be distributed evenly between the two, once the system has had time to adjust.
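The numbers behind Figure 2.8a can be reproduced from eqn. 2.37. A minimal Python sketch, using p = q = 0.5 and E = 20 as in the text:

```python
# Binomial distribution (eqn. 2.37) for E = 20 quanta with p = q = 0.5,
# so P(e) = Omega(e) * 0.5**20 -- the curve plotted in Figure 2.8a.
from math import comb

E, p = 20, 0.5
P = [comb(E, e) * p**E for e in range(E + 1)]

print(max(range(E + 1), key=lambda e: P[e]))  # 10: the equal split
print(f"{P[1]:.1e}")                          # 1.9e-05: a 1/19 split
print(sum(P))                                 # 1.0
```

The distribution peaks sharply at the equal split, and the probabilities over all possible splits sum to 1, as they must.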
Figure 2.8b shows Ω as a function of e, the number of energy units in the left block. Comparing the two, as well as eqn. 2.38, we see that the most probable distribution of energy between the blocks corresponds to the situation where the system has the maximal number of states accessible to it (i.e., to where Ω(e) is maximum).
According to our earlier definition of equilibrium, the state ultimately reached by this system when we removed the constraint (the insulation) is the equilibrium one. We can see here that, unlike the ball on the hill, we cannot determine whether this system is at equilibrium or not simply from its energy: the total energy of the system remained constant. In general, for a thermodynamic system, whether the system is at equilibrium depends not on its total energy but on how that energy is internally distributed.
Clearly, it would be useful to have a function that could predict the internal distribution of energy at equilibrium. The function that does this is the entropy. To understand this, let's return to our copper blocks. Initially, a thermal barrier separates the two copper blocks and we can think of each as an isolated system. We assume that each has an internal energy distribution that is at or close to the most probable one (i.e., each is internally at equilibrium). Each block has its own function Ω (which we denote as Ω_l and Ω_r for the left and right block, respectively) that gives the number of states accessible to it at a particular energy distribution. We assume that the initial energy distribution is not the final one, so that when we remove the insulation, the energy distribution of the system will spontaneously change. In other words:
Figure 2.8 (a) Probability of one of two copper blocks of equal mass in thermal equilibrium having e units of energy when the total energy of the two blocks is 20 units. (b) Ω, number of states available to the system (combinations of energy distribution) as a function of e.
e^i ≠ e^f

where we use the superscripts i and f to denote initial and final, respectively.
When the left block has energy e, it can be in any one of Ω_l = Ω(e) possible states, and the right block can be in any one of Ω_r = Ω(E − e) states. Both P and Ω are multiplicative, so the total number of possible states after we remove the insulation, Ω, will be:

Ω = Ω_l Ω_r
To make P and Ω additive, we simply take the log:
ln Ω = ln Ω_l + ln Ω_r        (2.40)
and
ln P = ln C + ln Ω        (2.41)
As additive properties, ln P and ln Ω are consistent with our other extensive state variables (e.g., U, V).
We want to know which energy distribution (i.e., values of e and E − e) is the most likely one, because that corresponds to the equilibrium state of the system. This is the same as asking where the probability function, P(e), is maximum. Maximum values of functions have the useful property that they occur at points where the derivative of the function is 0. That is, a maximum of function ƒ(x) will occur where dƒ(x)/dx = 0.* Thus, the maximum value of P in Figure 2.8 occurs where dP/de = 0. The most probable energy distribution will therefore occur at:
(∂ ln P/∂e) = 0        (2.42)
(we use the partial differential notation to indicate that, since the system is isolated, all other state variables are held constant). Substituting eqn. 2.41 into 2.42, we have:
(∂ ln Ω/∂e) = 0        (2.43)
(since C is a constant). Then substituting eqn. 2.40 into 2.43 we have:
(∂ ln Ω_l/∂e) + (∂ ln Ω_r/∂e) = 0        (2.44)
so the maximum occurs at:
(∂ ln Ω_l/∂e) = −(∂ ln Ω_r/∂e)        (2.45)
The maximum then occurs where the functions ∂lnΩ/∂e of the two blocks are equal (the negative sign cancels because Ω_r is a function of E − e, so that ∂lnΩ_r/∂e = −∂lnΩ_r/∂(E − e)). More generally, we may write:
(∂ ln Ω_l/∂E_l) = (∂ ln Ω_r/∂E_r)        (2.46)
Notice two interesting things: the equilibrium energy distribution is the one where ln Ω is maximum (since P is proportional to Ω) and where the functions ∂lnΩ/∂E of the two blocks are equal. It would appear that both are very useful functions. We define entropy, S,† as:
S ≡ k ln Ω        (2.47)†
and a function β such that:
β ≡ (1/k)(∂S/∂E) = ∂ ln Ω/∂E        (2.48)
where k is a constant (which turns out to be Boltzmann's constant or the gas constant; the choice depends on whether we work in units of atoms or moles, respectively). The function S then has the property that it is maximum at equilibrium and β has the property that it is the same in every part of the system at equilibrium.
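The chain of reasoning in eqns. 2.40-2.48 can be sketched numerically, but doing so requires a model for each block's internal Ω, which the text does not specify. A minimal illustrative assumption is to treat each block as 20 two-level sites, so that a block holding e units has C(20, e) internal states. With that assumption, a short Python sketch shows that the total ln Ω is maximized at the equal split, where the discrete slopes ∂lnΩ/∂e of the two blocks match:

```python
# Model each block as 20 two-level sites (an illustrative assumption,
# not from the text), so a block holding e units has comb(20, e)
# internal states. Share E = 12 units and find the most probable split.
from math import comb, log

SITES = 20   # assumed internal degrees of freedom per block
E = 12       # total energy units to distribute (assumed)

def ln_omega(e):
    # ln of the number of internal states of one block holding e units
    return log(comb(SITES, e))

# eqn. 2.40: ln Omega is additive over the two blocks
total = [ln_omega(e) + ln_omega(E - e) for e in range(E + 1)]
e_star = max(range(E + 1), key=lambda e: total[e])
print(e_star)  # 6: the equal split maximizes ln Omega

# Discrete version of eqn. 2.46: the slopes d(ln Omega)/de of the two
# blocks are equal at the maximum.
beta_l = ln_omega(e_star + 1) - ln_omega(e_star)
beta_r = ln_omega(E - e_star + 1) - ln_omega(E - e_star)
print(round(beta_l, 3), round(beta_r, 3))  # 0.693 0.693
```

Because the blocks are identical, the maximum falls at the symmetric split, and the β-like slopes of the two blocks agree there, which is exactly the equilibrium condition of eqn. 2.46.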
Entropy also has the interesting property that in any spontaneous reaction, the total entropy of the system plus its surroundings must increase. In our example, this is a simple consequence of the observation that the final probability, P, and therefore also Ω, will be maximum and hence never less than the original one. Because of that, the final number of accessible states must exceed the initial number and:
Ω_l^f Ω_r^f ≥ Ω_l^i Ω_r^i        (2.49)
Taking the log of both sides, multiplying by k, and rearranging:

k[ln Ω_l^f − ln Ω_l^i] + k[ln Ω_r^f − ln Ω_r^i] ≥ 0
The quantities in brackets are simply the entropy changes of the two blocks. Hence:
ΔS_l + ΔS_r ≥ 0        (2.50)
In other words, any decrease in entropy in one of the blocks must be at least compensated for by an increase in entropy of the other block.
For an irreversible process, that is, a spontaneous one such as the thermal equilibration of our two copper blocks, we cannot determine the increase in entropy exactly. Experience has shown, however, that the increase in entropy will always exceed the ratio of heat exchanged to temperature. Thus, the mathematical formulation of the second law is:
dS ≥ dQ/T        (2.51)
Like the first law, eqn. 2.51 cannot be derived or formally proven; it is simply a postulate that has never been contradicted by experience. For a reversible reaction, that is, one that is never far from equilibrium and therefore one where dQ is small relative to T,
dS = dQ/T        (2.52)
(see Example 2.1). In thermodynamics, we restrict our attention to systems that are close to equilibrium, so eqn. 2.52 serves as an operational definition of entropy.
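Eqn. 2.52 can be applied directly to the copper-block example by integrating dS = dQ/T for each block separately. For equal masses and a constant specific heat c, dQ = mc dT and the integral gives ΔS = mc ln(Tf/Ti). The mass and temperatures in the Python sketch below are illustrative assumptions, not values from the text:

```python
# Entropy change of two equal copper blocks coming to thermal
# equilibrium, from dS = dQ/T with dQ = m*c*dT, so dS = m*c*ln(Tf/Ti).
# Mass, specific heat, and temperatures are illustrative assumptions.
from math import log

m = 1.0                # kg of copper per block (assumed)
c = 385.0              # J/(kg K), approximate specific heat of copper
T1, T2 = 280.0, 320.0  # assumed initial temperatures, K
Tf = (T1 + T2) / 2     # equal masses: the final temperature is the mean

dS_cold = m * c * log(Tf / T1)  # the cooler block gains entropy
dS_hot = m * c * log(Tf / T2)   # the hotter block loses entropy
print(dS_cold + dS_hot > 0)     # True: total entropy increases
```

The hotter block's entropy decreases, but the cooler block gains more than the hotter one loses, so the total entropy of the isolated system increases, just as eqn. 2.50 requires.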