Free entropy

A thermodynamic "free entropy" is an entropic thermodynamic potential analogous to the free energy. It is also known as a Massieu, Planck, or Massieu-Planck potential (or function), or (rarely) as free information. In statistical mechanics, free entropies frequently appear as the logarithm of a partition function. In mathematics, free entropy is the generalization of entropy defined in free probability.

A free entropy is generated by a Legendre transform of the entropy. The different potentials correspond to different constraints to which the system may be subjected. The most common examples are:

: \Phi = S - \frac{U}{T} = -\frac{A}{T} (the Massieu potential)
: \Xi = \Phi - \frac{PV}{T} = -\frac{G}{T} (the Planck potential)

where

* S is the entropy
* \Phi is the Massieu potential [Planes & Vives 2000] [Wada & Scarfone 2004]
* \Xi is the Planck potential
* U is the internal energy
* T is the temperature
* P is the pressure
* V is the volume
* A is the Helmholtz free energy
* G is the Gibbs free energy
* N_i is the number of particles (or number of moles) of the i-th chemical component
* \mu_i is the chemical potential of the i-th chemical component
* s is the total number of chemical components
* i is an index over the components
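As a sanity check on these definitions, here is a minimal symbolic sketch (using Python's sympy; not part of the original article) confirming that \Phi = -A/T and \Xi = -G/T follow from the standard definitions A = U - TS and G = U + PV - TS:

```python
import sympy as sp

# S is left as a free symbol; this check is purely algebraic.
U, T, P, V = sp.symbols('U T P V', positive=True)
S = sp.Symbol('S')

A = U - T*S          # Helmholtz free energy
G = U + P*V - T*S    # Gibbs free energy

Phi = S - U/T        # Massieu potential
Xi = Phi - P*V/T     # Planck potential

print(sp.simplify(Phi - (-A/T)))  # 0, i.e. Phi = -A/T
print(sp.simplify(Xi - (-G/T)))   # 0, i.e. Xi = -G/T
```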

Note that the use of the terms "Massieu" and "Planck" for explicit Massieu-Planck potentials is somewhat obscure and ambiguous; in particular, "Planck potential" has alternative meanings. The most standard notation for an entropic potential is \psi, used by both Planck and Schrödinger. (Note that Gibbs used \psi to denote the free energy.) Free entropies were invented by Massieu in 1869, and actually predate the Gibbs free energy (1875).

Dependence of the potentials on the natural variables

Entropy

: S = S(U, V, \{N_i\})

By the definition of a total differential,

: dS = \frac{\partial S}{\partial U}\, dU + \frac{\partial S}{\partial V}\, dV + \sum_{i=1}^s \frac{\partial S}{\partial N_i}\, dN_i.

From the equations of state,

: dS = \frac{1}{T}\, dU + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i.

The differentials in the above equation are all of extensive variables, and S is a first-order homogeneous function of them, so by Euler's theorem for homogeneous functions the equation may be integrated to yield

: S = \frac{U}{T} + \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right).
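To see the integration work on a concrete model, the following sketch (an illustration, not from the original text) assumes the Sackur-Tetrode entropy of a single-component monatomic ideal gas, with all physical constants folded into a hypothetical symbol c, and checks Euler's relation with sympy:

```python
import sympy as sp

U, V, N, k, c = sp.symbols('U V N k c', positive=True)

# Sackur-Tetrode entropy of a monatomic ideal gas (one component);
# c is an assumed constant collecting h, m, pi, etc.
S = N*k*(sp.log(c*(V/N)*(U/N)**sp.Rational(3, 2)) + sp.Rational(5, 2))

# Equations of state, read off as partial derivatives of S:
inv_T = sp.diff(S, U)            # 1/T
P_over_T = sp.diff(S, V)         # P/T
neg_mu_over_T = sp.diff(S, N)    # -mu/T

# Euler integration: S = U/T + PV/T - mu N/T for a first-order
# homogeneous S(U, V, N).
print(sp.simplify(U*inv_T + V*P_over_T + N*neg_mu_over_T - S))  # 0
```

The check succeeds precisely because S scales linearly when U, V, and N are all scaled together, which is what licenses the integration step.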

Massieu potential / Helmholtz free entropy

: \Phi = S - \frac{U}{T}
: \Phi = \frac{U}{T} + \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right) - \frac{U}{T}
: \Phi = \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right)

Starting again from the definition of \Phi and taking the total differential, we have, via a Legendre transform (and the product rule),

: d\Phi = dS - \frac{1}{T}\, dU - U\, d\frac{1}{T},
: d\Phi = \frac{1}{T}\, dU + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i - \frac{1}{T}\, dU - U\, d\frac{1}{T},
: d\Phi = -U\, d\frac{1}{T} + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i.

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From d\Phi we see that

: \Phi = \Phi\left(\frac{1}{T}, V, \{N_i\}\right).
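Under the same assumed ideal-gas model (with the hypothetical constant c as before), one can check symbolically that the coefficient of d(1/T) in d\Phi is indeed -U:

```python
import sympy as sp

beta, V, N, k, c = sp.symbols('beta V N k c', positive=True)  # beta = 1/T

# Monatomic ideal gas with U eliminated via U = (3/2) N k T = (3/2) N k / beta:
U = sp.Rational(3, 2)*N*k/beta
S = N*k*(sp.log(c*(V/N)*(U/N)**sp.Rational(3, 2)) + sp.Rational(5, 2))

Phi = S - beta*U    # Massieu potential as a function of (beta, V, N)

# The coefficient of d(beta) in d(Phi) should be -U:
print(sp.simplify(sp.diff(Phi, beta) + U))  # 0
```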

If reciprocal variables are not desired (Debye 1954, p. 222),

: d\Phi = dS - \frac{T\, dU - U\, dT}{T^2},
: d\Phi = dS - \frac{1}{T}\, dU + \frac{U}{T^2}\, dT,
: d\Phi = \frac{1}{T}\, dU + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i - \frac{1}{T}\, dU + \frac{U}{T^2}\, dT,
: d\Phi = \frac{U}{T^2}\, dT + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i,
: \Phi = \Phi(T, V, \{N_i\}).
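The same assumed model, rewritten in the variables (T, V, N), reproduces the coefficients U/T^2 and P/T stated above:

```python
import sympy as sp

T, V, N, k, c = sp.symbols('T V N k c', positive=True)

# Monatomic ideal gas in the variables (T, V, N):
U = sp.Rational(3, 2)*N*k*T    # caloric equation of state
P = N*k*T/V                    # mechanical equation of state
S = N*k*(sp.log(c*(V/N)*(U/N)**sp.Rational(3, 2)) + sp.Rational(5, 2))

Phi = S - U/T                  # Massieu potential

# Coefficients of dT and dV in d(Phi):
print(sp.simplify(sp.diff(Phi, T) - U/T**2))  # 0
print(sp.simplify(sp.diff(Phi, V) - P/T))     # 0
```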

Planck potential / Gibbs free entropy

: \Xi = \Phi - \frac{PV}{T}
: \Xi = \frac{PV}{T} + \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right) - \frac{PV}{T}
: \Xi = \sum_{i=1}^s \left(-\frac{\mu_i N_i}{T}\right)

Starting again from the definition of \Xi and taking the total differential, we have, via a Legendre transform (and the product rule),

: d\Xi = d\Phi - \frac{P}{T}\, dV - V\, d\frac{P}{T},
: d\Xi = -U\, d\frac{1}{T} + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i - \frac{P}{T}\, dV - V\, d\frac{P}{T},
: d\Xi = -U\, d\frac{1}{T} - V\, d\frac{P}{T} + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i.

The above differentials are not all of extensive variables, so the equation may not be directly integrated. From d\Xi we see that

: \Xi = \Xi\left(\frac{1}{T}, \frac{P}{T}, \{N_i\}\right).
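Again on the assumed ideal-gas model, taking \beta = 1/T and \gamma = P/T as the independent variables confirms the coefficients -U and -V in d\Xi:

```python
import sympy as sp

beta, gamma, N, k, c = sp.symbols('beta gamma N k c', positive=True)
# Reciprocal (entropic) variables: beta = 1/T, gamma = P/T.

# Monatomic ideal gas: U = (3/2) N k / beta; P V = N k T gives V = N k / gamma.
U = sp.Rational(3, 2)*N*k/beta
V = N*k/gamma
S = N*k*(sp.log(c*(V/N)*(U/N)**sp.Rational(3, 2)) + sp.Rational(5, 2))

Xi = S - beta*U - gamma*V   # Planck potential as a function of (beta, gamma, N)

# Coefficients of d(beta) and d(gamma) in d(Xi):
print(sp.simplify(sp.diff(Xi, beta) + U))   # 0, i.e. -U
print(sp.simplify(sp.diff(Xi, gamma) + V))  # 0, i.e. -V
```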

If reciprocal variables are not desired (Debye 1954, p. 222),

: d\Xi = d\Phi - \frac{T(P\, dV + V\, dP) - PV\, dT}{T^2},
: d\Xi = d\Phi - \frac{P}{T}\, dV - \frac{V}{T}\, dP + \frac{PV}{T^2}\, dT,
: d\Xi = \frac{U}{T^2}\, dT + \frac{P}{T}\, dV + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i - \frac{P}{T}\, dV - \frac{V}{T}\, dP + \frac{PV}{T^2}\, dT,
: d\Xi = \frac{U + PV}{T^2}\, dT - \frac{V}{T}\, dP + \sum_{i=1}^s \left(-\frac{\mu_i}{T}\right) dN_i,
: \Xi = \Xi(T, P, \{N_i\}).
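Finally, the same assumed model in the variables (T, P, N) reproduces the coefficients (U + PV)/T^2 and -V/T of the last differential:

```python
import sympy as sp

T, P, N, k, c = sp.symbols('T P N k c', positive=True)

# Monatomic ideal gas in the variables (T, P, N):
V = N*k*T/P
U = sp.Rational(3, 2)*N*k*T
S = N*k*(sp.log(c*(V/N)*(U/N)**sp.Rational(3, 2)) + sp.Rational(5, 2))

Xi = S - U/T - P*V/T           # Planck potential

# Coefficients of dT and dP in d(Xi):
print(sp.simplify(sp.diff(Xi, T) - (U + P*V)/T**2))  # 0
print(sp.simplify(sp.diff(Xi, P) + V/T))             # 0
```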

References

*Massieu, M.F. (1869). Comptes Rendus de l'Académie des Sciences 69: 858, 1057.
*Callen, Herbert B. (1985). Thermodynamics and an Introduction to Thermostatistics, 2nd ed. New York: John Wiley & Sons. ISBN 0-471-86256-8.
*Planes, Antoni; Vives, Eduard (2000). "Entropic variables and Massieu-Planck functions". Entropic Formulation of Statistical Mechanics. Universitat de Barcelona. http://www.ecm.ub.es/condensed/eduard/papers/massieu/node2.html
*Wada, T.; Scarfone, A.M. (2004). "Connections between Tsallis' formalisms employing the standard linear average energy and ones employing the normalized q-average energy". Physics Letters A 335 (5-6): 351-362. doi:10.1016/j.physleta.2004.12.054. arXiv:cond-mat/0410527.
*Debye, Peter J. W. (1954). The Collected Papers of Peter J. W. Debye. New York: Interscience Publishers.

