Joint quantum entropy

The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states ρ and σ, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written S(ρ,σ) or H(ρ,σ), depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2.

In this article, we will use S(ρ,σ) for the joint quantum entropy.

Background

In information theory, for any classical random variable X, the classical Shannon entropy H(X) is a measure of how uncertain we are about the outcome of X. For example, if X is a probability distribution concentrated at one point, the outcome of X is certain and therefore its entropy is H(X) = 0. At the other extreme, if X is the uniform probability distribution with n possible values, one would intuitively expect X to carry the most uncertainty. Indeed, such uniform probability distributions have the maximum possible entropy, H(X) = log_2(n).
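
As a concrete check of these two extremes, here is a minimal sketch in Python with numpy (the helper name shannon_entropy is ours, for illustration only):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(X) in bits of a probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) = 0
    return float(np.sum(p * np.log2(1.0 / p)))

print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # point mass: 0.0
print(shannon_entropy([0.25] * 4))            # uniform, n = 4: log2(4) = 2.0
```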

In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state ρ, the von Neumann entropy is defined by

:-\operatorname{Tr}\, \rho \log \rho.

Applying the spectral theorem, or the Borel functional calculus for infinite-dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank one projection, has zero von Neumann entropy. We write the von Neumann entropy S(ρ) (or sometimes H(ρ)).
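
Both extremes are easy to check numerically from the eigenvalues of ρ. A minimal sketch in Python with numpy (the helper von_neumann_entropy is our own naming):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)   # rho is Hermitian
    evals = evals[evals > 1e-12]      # drop (numerically) zero eigenvalues
    return float(np.sum(evals * np.log2(1.0 / evals)))

pure = np.array([[1, 0], [0, 0]], dtype=complex)  # rank-one projection
mixed = np.eye(2, dtype=complex) / 2              # maximally mixed qubit

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0 = log2(2)
```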

Definition

Given a quantum system with two subsystems "A" and "B", the term joint quantum entropy simply refers to the von Neumann entropy of the combined system. This distinguishes it from the entropy of the subsystems. In symbols, if the combined system is in state ρ^{AB},

the joint quantum entropy is then

:S(\rho^A, \rho^B) = S(\rho^{AB}) = -\operatorname{Tr}(\rho^{AB} \log(\rho^{AB})).

Each subsystem has its own entropy. The states of the subsystems are given by the partial trace operation.
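
The following sketch (Python with numpy; the helper names entropy, partial_trace_B and partial_trace_A are illustrative) computes a joint entropy and the subsystem entropies obtained via the partial trace, here implemented by reshaping the density matrix and tracing out one index pair:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(np.sum(evals * np.log2(1.0 / evals)))

def partial_trace_B(rho_AB, dA, dB):
    """rho^A = Tr_B(rho^AB): reshape to (dA, dB, dA, dB) and trace out B."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def partial_trace_A(rho_AB, dA, dB):
    """rho^B = Tr_A(rho^AB)."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

# Example: a product state -- a maximally mixed qubit A with a pure qubit B.
rho_AB = np.kron(np.eye(2) / 2, np.diag([1.0, 0.0]))

print(entropy(rho_AB))                         # joint entropy S(rho^AB) = 1.0
print(entropy(partial_trace_B(rho_AB, 2, 2)))  # S(rho^A) = 1.0
print(entropy(partial_trace_A(rho_AB, 2, 2)))  # S(rho^B) = 0.0
```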

Properties

The classical joint entropy is always at least equal to the entropy of each individual system. This is not the case for the joint quantum entropy. If the quantum state ρ^{AB} exhibits quantum entanglement, then the entropy of each subsystem may be larger than the joint entropy. This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.

Consider a maximally entangled state such as a Bell state. If ρ^{AB} is a Bell state, say,

:\left| \Psi \right\rangle = \frac{1}{\sqrt{2}} \left( |00\rangle + |11\rangle \right),

then the total system is a pure state, with entropy 0, while each individual subsystem is a maximally mixed state, with maximum von Neumann entropy log_2 2 = 1. Thus the joint entropy of the combined system is less than that of its subsystems. This is because for entangled states, definite states cannot be assigned to the subsystems, resulting in positive entropy for each subsystem.
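
These numbers are easy to verify numerically. A minimal sketch, again in Python with numpy (helper names are ours): the joint state is pure, so its entropy is 0, while each reduced state is I/2 with entropy 1.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(np.sum(evals * np.log2(1.0 / evals)))

# Bell state |Psi> = (|00> + |11>) / sqrt(2) as a density matrix
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)

# Reduced state of A: trace out the B indices
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(entropy(rho_AB))  # 0.0: the joint state is pure
print(entropy(rho_A))   # 1.0: the subsystem is maximally mixed (I/2)
```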

Notice that the above phenomenon cannot occur if the state is a separable pure state. In that case the reduced states of the subsystems are also pure, and therefore all entropies are zero.

Relations to other entropy measures

The joint quantum entropy S(ρ^{AB}) can be used to define the conditional quantum entropy:

:S(\rho^A|\rho^B) \ \stackrel{\mathrm{def}}{=}\ S(\rho^A,\rho^B) - S(\rho^B)

and the quantum mutual information:

:S(\rho^A:\rho^B) \ \stackrel{\mathrm{def}}{=}\ S(\rho^A) + S(\rho^B) - S(\rho^A,\rho^B)

These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
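
A short sketch (Python with numpy, helper names ours) that evaluates both quantities for the Bell state above, exhibiting the negative conditional entropy mentioned earlier:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(np.sum(evals * np.log2(1.0 / evals)))

# Bell state and its reduced states
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

S_AB, S_A, S_B = entropy(rho_AB), entropy(rho_A), entropy(rho_B)

print(S_AB - S_B)        # conditional entropy S(A|B) = -1.0 (negative!)
print(S_A + S_B - S_AB)  # quantum mutual information S(A:B) = 2.0
```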

See also

*Quantum relative entropy

*Quantum mutual information


