Entropy of mixing


The entropy of mixing is the change in the configuration entropy, an extensive thermodynamic quantity, when two different chemical substances or components are mixed. This entropy change must be positive since there is more uncertainty about the spatial locations of the different kinds of molecules when they are interspersed. We assume that the mixing process has reached thermodynamic equilibrium so that the mixture is uniform and homogeneous. If the substances being mixed are initially at different temperatures and pressures, there will, of course, be an additional entropy increase in the mixed substance due to these differences being equilibrated, but if the substances being mixed are initially at the same temperature and pressure, the entropy increase will be entirely due to the entropy of mixing.

The entropy of mixing may be calculated by Gibbs' theorem, which states that when two different substances mix, the entropy increase upon mixing is equal to the entropy increase that would occur if the two substances were to expand alone into the mixing volume. (In this sense, then, the term "entropy of mixing" is a misnomer, since the entropy increase is not due to any "mixing" effect.) Nevertheless, the two substances must be different for the entropy of mixing to exist. This is the Gibbs paradox: if the two substances are identical, there will be no entropy change, yet the slightest detectable difference between the two will yield a considerable entropy change, which is just the entropy of mixing. In other words, the entropy of mixing is not a continuous function of the degree of difference between the two substances.

The entropy of mixing $\Delta S_m$ is given by:

:$\Delta S_m = -nR(x_1\ln x_1 + x_2\ln x_2)$

where $R$ is the gas constant, $n$ the total number of moles, and $x_i$ the mole fraction of each of the mixed components.
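As a quick numerical check, the formula can be evaluated directly. The sketch below is illustrative only: the function name and the rounded numeric value of $R$ are our own choices, not from the source. For an equimolar binary mixture the result reduces to $R\ln 2$ per mole.

```python
import math

R = 8.314  # molar gas constant, J/(mol·K) (rounded)

def entropy_of_mixing(n, fractions):
    """Delta S_m = -n R sum(x_i ln x_i) over the mole fractions x_i."""
    return -n * R * sum(x * math.log(x) for x in fractions if x > 0)

# Equimolar binary mixture, 1 mol total: -R(0.5 ln 0.5 + 0.5 ln 0.5) = R ln 2
ds = entropy_of_mixing(1.0, [0.5, 0.5])
print(round(ds, 3))  # → 5.763 J/K, i.e. R ln 2
```

Note that a pure substance ("mixing" a single component with itself) gives zero, consistent with the Gibbs paradox discussion above.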

Note that the increase in entropy involves no heat flow (just the irreversible process of mixing), so the equation

:$dS = \frac{dQ}{T}$

(see Entropy) does not apply [Statistical and Thermal Physics, M. D. Sturge, A. K. Peters, 2003, p. 181]. Since there is no change in internal energy ($U$), volume ($V$), or enthalpy ($H$), there is no exchange of heat or work with the surroundings, and so the thermal entropy of the system and the surroundings will not change [Physical Chemistry, S. C. Wallwork and D. J. W. Grant, Longman, 1977, p. 292].

Proof

Assume that the molecules of two different substances are approximately the same size, and regard space as subdivided into a lattice whose cells are the size of the molecules. (In fact, any lattice would do, including close packing.) This is a crystal-like conceptual model used to identify the molecular centers of mass. If the two phases are liquids, there is no spatial uncertainty in each one individually (see Note 1). Everywhere we look in component 1, there is a molecule present, and likewise for component 2. After they are intermingled (assuming they are miscible), the liquid is still dense with molecules, but now there is uncertainty about what kind of molecule is in which location. Of course, any idea of identifying molecules in given locations is a thought experiment, not something one could do, but the calculation of the uncertainty is well defined.

We can use Boltzmann's equation for the entropy change as applied to the "mixing" process

:$\Delta S_m = k_B \ln\Omega$

where $k_B$ is Boltzmann's constant. We then calculate the number of ways $\Omega$ of arranging $N_1$ molecules of component 1 and $N_2$ molecules of component 2 on a lattice, where

:$N = N_1 + N_2$

is the total number of molecules, and therefore the number of lattice sites.

Calculating the number of permutations of $N$ objects, correcting for the fact that $N_1$ of them are "identical" to one another, and likewise for $N_2$, gives

:$\Omega = \frac{N!}{N_1!\,N_2!}$

After applying Stirling's approximation, the result is

:$\Delta S_m = -k_B\left[N_1\ln(N_1/N) + N_2\ln(N_2/N)\right]$
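The quality of Stirling's approximation here is easy to check numerically. The sketch below (function names are our own) compares the exact value of $\ln\Omega$, computed with the log-gamma function, against the approximated form $-[N_1\ln(N_1/N) + N_2\ln(N_2/N)]$; the relative error shrinks as $N$ grows.

```python
import math

def exact_ln_omega(n1, n2):
    """Exact ln(N!/(N1! N2!)) via the log-gamma function: ln(k!) = lgamma(k+1)."""
    n = n1 + n2
    return math.lgamma(n + 1) - math.lgamma(n1 + 1) - math.lgamma(n2 + 1)

def stirling_ln_omega(n1, n2):
    """Stirling-approximated form: -[N1 ln(N1/N) + N2 ln(N2/N)]."""
    n = n1 + n2
    return -(n1 * math.log(n1 / n) + n2 * math.log(n2 / n))

n1, n2 = 10**6, 2 * 10**6
exact = exact_ln_omega(n1, n2)
approx = stirling_ln_omega(n1, n2)
print(abs(approx - exact) / exact)  # tiny relative error at molecular-scale N
```

For a thermodynamic sample ($N \sim 10^{23}$), the neglected terms are utterly negligible.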

This expression can be generalized to a mixture of $r$ components, with $N_i$ molecules of component $i$ for $i = 1, 2, 3, \ldots, r$:

:$\Delta S_m = -k_B\sum_{i=1}^r N_i\ln(N_i/N) = -k_B\sum_{i=1}^r N_i\ln x_i$

where we have introduced the mole fractions, which are also the probabilities of finding any particular component in a given lattice site.

::$x_i = N_i/N = p_i$

A more direct and logically transparent derivation, not requiring Stirling's approximation, is to start with the Shannon entropy, or compositional uncertainty (see Note 2):

:$-k_B\sum_{i=1}^r p_i \ln p_i$

The summation is over the various chemical species, so this is the uncertainty about which kind of molecule occupies any one site. It must be multiplied by the number of sites $N$ to get the uncertainty for the whole system. The entropy of mixing from above can be rearranged as

:$\Delta S_m = -k_B N\sum_{i=1}^r (N_i/N)\ln(N_i/N)$

The equivalence of the two follows immediately.
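The equivalence is an exact algebraic identity, not an approximation, which the sketch below (our own illustrative names and numbers) confirms: $N$ sites times the per-site Shannon uncertainty equals the summed form $-\sum_i N_i\ln(N_i/N)$, working in units of $k_B$.

```python
import math

def per_site_uncertainty(probs):
    """Shannon uncertainty -sum p_i ln p_i (in units of k_B) for a single site."""
    return -sum(p * math.log(p) for p in probs if p > 0)

N = 1000
counts = [200, 300, 500]          # N_i molecules of each species
probs = [c / N for c in counts]   # mole fractions x_i = p_i

total = N * per_site_uncertainty(probs)            # N sites x per-site uncertainty
direct = -sum(c * math.log(c / N) for c in counts) # -sum N_i ln(N_i/N)
print(math.isclose(total, direct))  # → True
```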

Reverting to two components, we obtain

:$\Delta S_m = -R(n_1\ln x_1 + n_2\ln x_2) = -nR(x_1\ln x_1 + x_2\ln x_2)$

where $R$ is the gas constant, equal to $k_B$ times Avogadro's number, $n_1$ and $n_2$ are the numbers of moles of the components, and $n$ is the total number of moles.

Since the mole fractions are necessarily less than one, the values of the logarithms are negative. The minus sign reverses this, giving a positive entropy of mixing, as expected.

Gibbs free energy of mixing

In an ideal gas or ideal solution (no enthalpy term), the Gibbs free energy change of mixing is given by:

:$\Delta G_m = nRT(x_1\ln x_1 + x_2\ln x_2)$

where $\Delta G_m$ is the Gibbs free energy of mixing and $T$ the absolute temperature [Graph as a function of temperature, number of moles and composition: [http://www.whfreeman.com/elements/content/livinggraphs/E3012.html Link].]

The Gibbs free energy of mixing is always negative, meaning that mixing of ideal solutions is always spontaneous. The lowest value occurs when each mole fraction is 0.5 for a mixture of two components, or 1/r for a mixture of r components.
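Both claims can be checked by scanning compositions of a binary mixture. The sketch below (function name and the rounded value of $R$ are our own choices) evaluates $\Delta G_m$ across mole fractions at room temperature and locates the minimum.

```python
import math

R = 8.314  # molar gas constant, J/(mol·K) (rounded)

def gibbs_of_mixing(n, T, fractions):
    """Delta G_m = n R T sum(x_i ln x_i); negative for any nontrivial ideal mixing."""
    return n * R * T * sum(x * math.log(x) for x in fractions if x > 0)

T = 298.15  # K
xs = [i / 100 for i in range(1, 100)]
dG = [gibbs_of_mixing(1.0, T, [x, 1 - x]) for x in xs]
x_min = xs[dG.index(min(dG))]
print(x_min)  # → 0.5, the most negative Delta G_m
```

By symmetry and convexity of $x\ln x + (1-x)\ln(1-x)$, the minimum sits exactly at the equimolar composition.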

Solutions

If the solute is a crystalline solid, the argument is much the same. A crystal has no spatial uncertainty at all, except for crystallographic defects, and a (perfect) crystal allows us to localize the molecules using the crystal symmetry group. The fact that volumes do not add when dissolving a solid in a liquid is not important for condensed phases. If the solute is not crystalline, we can still use a spatial lattice, as good an approximation for an amorphous solid as it is for a liquid.

The Flory-Huggins solution theory provides the entropy of mixing for polymer solutions, in which the macromolecules are huge compared to the solvent molecules. In this case, the assumption is made that each monomer subunit in the polymer chain occupies a lattice site.
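In that lattice picture the same logarithmic form appears, but with volume fractions in place of mole fractions. The standard Flory-Huggins result (stated here for orientation, with $N_1$ solvent molecules, $N_2$ polymer chains, and $\varphi_i$ the volume fractions) is

:$\Delta S_m = -k_B\left(N_1\ln\varphi_1 + N_2\ln\varphi_2\right)$

which reduces to the expression derived above when both components occupy one site each, so that volume fractions and mole fractions coincide.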

Note that solids in contact with each other also slowly interdiffuse, and solid mixtures of two or more components may be made at will (alloys, semiconductors, etc.). Again, the same equations for the entropy of mixing apply, but only for homogeneous, uniform phases.

Gases

In gases there is a lot more spatial uncertainty because most of their volume is merely empty space. We can regard the mixing process as simply conjoining the two containers. The two lattices which allow us to conceptually localize molecular centers of mass also join. The total number of empty cells is the sum of the numbers of empty cells in the two components prior to mixing. Consequently, that part of the spatial uncertainty concerning whether "any" molecule is present in a lattice cell is the sum of the initial values, and does not increase upon mixing.

Almost everywhere we look, we find empty lattice cells. But we do find molecules in those few cells which are occupied. For each one, there is a "contingent" uncertainty about which kind of molecule it is. Using conditional probabilities, it turns out that the analytical problem for the small subset of occupied cells is exactly the same as for mixed liquids, and the "increase" in the entropy, or spatial uncertainty, has exactly the same form as obtained previously. Obviously the subset of occupied cells is not the same at different times. But only when an occupied cell is found do we ask which kind of molecule is there.

See also: Gibbs paradox, in which it would seem that mixing two samples of the "same" gas would produce entropy.

In a paper on this topic [Lost Work and the Entropy of Mixing, R. G. Keesing, European Journal of Physics, 1986, Vol. 7, pp. 266-268], R. G. Keesing notes:

It is often implied that the mixing of two perfect gases leads to an increase in entropy. However when two perfect gases, in thermal equilibrium, mix there is no increase in entropy solely as a result of their mixing. The entropy increase on mixing through mutual diffusion is solely the result of the two individual gases occupying larger volumes and is the same whether or not the gases have mixed. The increase in entropy comes about because each gas absorbs heat in the process of isothermal expansion which is the same whether or not the gases have mixed. Thus the idea that increases in entropy, in non-interacting systems, is somehow connected with the mixing process is false.

Notes

* 1. This is, of course, an approximation. Liquids have a “free volume” which is why they are (usually) less dense than solids.
* 2. Claude Shannon introduced this expression for use in information theory, but similar formulas can be found as far back as the work of Ludwig Boltzmann and J. Willard Gibbs. Shannon uncertainty is completely unrelated to the Heisenberg uncertainty principle in quantum mechanics.

* [http://www.phys.uri.edu/~gerhard/PHY525/tln25.pdf Online lecture]
* [http://www.msm.cam.ac.uk/phase-trans/mphil/MP4-3.pdf Online lecture]

Wikimedia Foundation. 2010.
