- Entropy rate

The **entropy rate** of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate $H(X)$ is the limit of the joint entropy of $n$ members of the process $X_k$ divided by $n$, as $n$ tends to infinity:

:$H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)$

when the limit exists. An alternative, related quantity is:

:$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)$
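When the process takes values in a finite alphabet, the first limit can be approximated from a long sample path: estimate the joint entropy of length-$n$ blocks from their empirical frequencies and divide by $n$. A minimal sketch, assuming an i.i.d. Bernoulli sample (the parameter, sample size, and block lengths are illustrative choices, not part of the definition):

```python
import math
import random
from collections import Counter

def block_entropy_rate(seq, n):
    """Plug-in estimate of H(X_1, ..., X_n) / n from length-n block frequencies."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / n

random.seed(0)
p = 0.3  # illustrative Bernoulli parameter
sample = [1 if random.random() < p else 0 for _ in range(200_000)]

# For an i.i.d. process the entropy rate equals the per-symbol entropy:
h_true = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
for n in (1, 4, 8):
    print(n, round(block_entropy_rate(sample, n), 4), "target", round(h_true, 4))
```

The plug-in estimator is biased for large $n$ relative to the sample size, so in practice the block length is kept small compared to the data.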

For strongly stationary stochastic processes, $H(X) = H'(X)$.

**Entropy rates for Markov chains**

Since a stochastic process defined by a Markov chain which is irreducible and aperiodic has a stationary distribution, the entropy rate is independent of the initial distribution. For example, for such a Markov chain $Y_k$ defined on a countable number of states, given the transition matrix $P_{ij}$, $H(Y)$ is given by:

:$H(Y) = -\sum_{ij} \mu_i P_{ij} \log P_{ij}$

where $\mu_i$ is the stationary distribution of the chain.

A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate equal to the entropy of any individual member of the process.
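For a finite-state chain the formula above is directly computable: find the stationary distribution $\mu$ (here by power iteration on $\mu \mapsto \mu P$) and sum $-\mu_i P_{ij} \log P_{ij}$. A sketch for a hypothetical two-state chain (the transition probabilities are illustrative):

```python
import math

# Illustrative 2-state transition matrix: P[i][j] = P(next = j | current = i)
P = [[0.9, 0.1],
     [0.4, 0.6]]

def stationary(P, iters=1000):
    """Stationary distribution via power iteration: mu = mu P."""
    mu = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]
    return mu

def entropy_rate(P):
    """H(Y) = -sum_ij mu_i P_ij log2 P_ij, in bits per symbol."""
    mu = stationary(P)
    return -sum(mu[i] * P[i][j] * math.log2(P[i][j])
                for i in range(len(P)) for j in range(len(P))
                if P[i][j] > 0)

print("stationary:", stationary(P))  # analytically (0.8, 0.2) for this P
print("entropy rate:", entropy_rate(P), "bits/symbol")
```

The result is a weighted average of the entropies of the rows of $P$, with the stationary distribution supplying the weights.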

**See also**

* Information source (mathematics)
* Markov information source

**References**

* Cover, T. and Thomas, J. (1991) "Elements of Information Theory", John Wiley and Sons, Inc., ISBN 0471062596 [http://www3.interscience.wiley.com/cgi-bin/bookhome/110438582?CRETRY=1&SRETRY=0]

**External links**

* [http://www.eng.ox.ac.uk/samp Systems Analysis, Modelling and Prediction (SAMP), University of Oxford]. MATLAB code for estimating information-theoretic quantities for stochastic processes.

*Wikimedia Foundation. 2010.*
