Tsallis entropy

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy, put forward by Constantino Tsallis in 1988. It is defined as

:S_q(p) = \frac{1}{q - 1} \left( 1 - \int p^q(x)\, dx \right),

or in the discrete case

:S_q(p) = \frac{1}{q - 1} \left( 1 - \sum_x p^q(x) \right).

Here p denotes the probability distribution of interest and q is a real parameter. In the limit q → 1, the standard Boltzmann–Gibbs entropy is recovered.

The parameter q is a measure of the non-extensivity of the system of interest. There are continuous and discrete versions of this entropic measure.
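The discrete definition above, and its q → 1 limit, can be checked numerically. The following is a minimal Python sketch (not part of the original article; the function name and the example distribution are illustrative):

```python
import math

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    if q == 1:
        # Limit q -> 1: the Boltzmann-Gibbs (Shannon) entropy -sum_i p_i ln p_i.
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
bg = tsallis_entropy(p, 1)          # Boltzmann-Gibbs entropy
near = tsallis_entropy(p, 1.0001)   # S_q for q close to 1
```

For q near 1 the two values nearly coincide, illustrating that the Boltzmann–Gibbs entropy is recovered in the limit.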

Various relationships

The discrete Tsallis entropy satisfies

:S_q = -\left[ D_q \sum_i p_i^x \right]_{x=1},

where D_q is the q-derivative.
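This identity can be verified numerically, taking the q-derivative in the Jackson sense, D_q f(x) = (f(qx) − f(x)) / ((q − 1)x). A minimal Python sketch (function names are illustrative, not from the article):

```python
def f(x, p):
    # f(x) = sum_i p_i^x
    return sum(pi ** x for pi in p)

def q_derivative_at_1(g, q, p):
    # Jackson q-derivative D_q g(x) = (g(qx) - g(x)) / ((q - 1) x), evaluated at x = 1
    return (g(q, p) - g(1, p)) / (q - 1)

p = [0.5, 0.25, 0.25]
q = 1.5
s_q_direct = (1 - sum(pi ** q for pi in p)) / (q - 1)   # Tsallis entropy directly
s_q_via_Dq = -q_derivative_at_1(f, q, p)                # via the q-derivative identity
```

Since f(1) = 1 for any normalized distribution, the two expressions agree exactly: −(Σ p_i^q − 1)/(q − 1) = (1 − Σ p_i^q)/(q − 1).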

Non-extensivity

Given two independent systems "A" and "B", for which the joint probability density satisfies

:p(A, B) = p(A)\, p(B),

the Tsallis entropy of this system satisfies

:S_q(A, B) = S_q(A) + S_q(B) + (1 - q)\, S_q(A)\, S_q(B).

From this result, it is evident that the parameter |1 − q| measures the departure from extensivity. In the limit q → 1,

:S(A, B) = S(A) + S(B),

which is what is expected for an extensive system.
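The pseudo-additivity rule above can be checked directly on two independent discrete systems. A minimal Python sketch (the distributions and names are illustrative, not from the article):

```python
from itertools import product

def tsallis(p, q):
    # Discrete Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

pA = [0.7, 0.3]
pB = [0.5, 0.25, 0.25]
q = 2.0

# Joint distribution of independent systems: p(A, B) = p(A) p(B)
pAB = [a * b for a, b in product(pA, pB)]

lhs = tsallis(pAB, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
# lhs and rhs agree up to floating-point error
```

For q ≠ 1 the cross term (1 − q) S_q(A) S_q(B) is nonzero, which is exactly the departure from extensivity.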

See also

*Rényi entropy

External links

* [http://www.cscs.umich.edu/~crshalizi/notabene/tsallis.html Tsallis Statistics, Statistical Mechanics for Non-extensive Systems and Long-Range Interactions]


Wikimedia Foundation. 2010.
