Joint quantum entropy
The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states $\rho$ and $\sigma$, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written $S(\rho,\sigma)$ or $H(\rho,\sigma)$, depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2.
In this article, we will use $S(\rho,\sigma)$ for the joint quantum entropy.
Background and motivation

In information theory, for any classical random variable $X$, the classical Shannon entropy $H(X)$ is a measure of how uncertain we are about the outcome of $X$. For example, if $X$ is a probability distribution concentrated at one point, the outcome of $X$ is certain and therefore its entropy $H(X) = 0$. At the other extreme, if $X$ is the uniform probability distribution with $n$ possible values, intuitively one would expect $X$ to be associated with the most uncertainty. Indeed, such uniform probability distributions have maximum possible entropy $H(X) = \log_2 n$.
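The two extremes above can be checked directly. The following is a minimal numpy sketch; the helper `shannon_entropy` is our own, not a library function:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(X) in bits; zero-probability outcomes contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Distribution concentrated at one point: the outcome is certain, so H = 0.
print(shannon_entropy([1.0, 0.0, 0.0]))           # 0.0

# Uniform distribution on n = 4 outcomes: maximum entropy log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```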
In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state $\rho$, the von Neumann entropy is defined by

$$S(\rho) = -\operatorname{Tr}\left(\rho \log \rho\right).$$
Applying the spectral theorem, or the Borel functional calculus for infinite dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank one projection, will have zero von Neumann entropy. We write the von Neumann entropy $S(\rho)$ (or sometimes $H(\rho)$).
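Via the spectral theorem, $S(\rho)$ is just the Shannon entropy of the eigenvalues of $\rho$, which suggests a simple numerical check of the two cases just mentioned. A sketch (the helper `von_neumann_entropy` is our own):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho) in bits, computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # by convention, 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

# Pure state (rank-one projection): zero entropy.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(pure))   # 0.0

# Maximally mixed qubit state I/2: maximum entropy, 1 bit.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))  # 1.0
```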
Definition

Given a quantum system with two subsystems $A$ and $B$, the term joint quantum entropy simply refers to the von Neumann entropy of the combined system. This is to distinguish it from the entropy of the subsystems. In symbols, if the combined system is in state $\rho^{AB}$, the joint quantum entropy is then

$$S(\rho^A, \rho^B) = -\operatorname{Tr}\left(\rho^{AB} \log \rho^{AB}\right).$$

Each subsystem has its own entropy. The states of the subsystems are given by the partial trace operation:

$$\rho^A = \operatorname{Tr}_B\, \rho^{AB}, \qquad \rho^B = \operatorname{Tr}_A\, \rho^{AB}.$$
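For two qubits, the partial trace that yields the subsystem states can be sketched in a few lines of numpy; the helper `partial_trace` below is our own, not a library function:

```python
import numpy as np

def partial_trace(rho_ab, keep, dims=(2, 2)):
    """Reduced state of subsystem A (keep=0) or B (keep=1) of a bipartite rho."""
    dA, dB = dims
    r = rho_ab.reshape(dA, dB, dA, dB)   # indices |i,j><k,l|
    if keep == 0:
        return np.einsum('ijkj->ik', r)  # trace out B (sum over j = l)
    return np.einsum('ijil->jl', r)      # trace out A (sum over i = k)

# Product state |0><0| (x) I/2: tracing out B recovers |0><0|.
rho_ab = np.kron(np.diag([1.0, 0.0]), np.eye(2) / 2)
print(partial_trace(rho_ab, keep=0))   # [[1. 0.] [0. 0.]]
print(partial_trace(rho_ab, keep=1))   # [[0.5 0. ] [0.  0.5]]
```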
Properties

The classical joint entropy is always at least equal to the entropy of each individual system. This is not the case for the joint quantum entropy. If the quantum state $\rho^{AB}$ exhibits quantum entanglement, then the entropy of each subsystem may be larger than the joint entropy. This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.
Consider a maximally entangled state such as a Bell state. If $\rho^{AB}$ is a Bell state, say,

$$\left|\Phi^+\right\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right),$$

then the total system is a pure state, with entropy 0, while each individual subsystem is a maximally mixed state, with maximum von Neumann entropy $\log_2 2 = 1$. Thus the joint entropy of the combined system is less than that of its subsystems. This is because for entangled states, definite states cannot be assigned to subsystems, resulting in positive entropy.
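This counterintuitive inequality is easy to verify numerically for the Bell state. A self-contained numpy sketch (the `entropy` helper is our own):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, via the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# Bell state (|00> + |11>)/sqrt(2) as a two-qubit density matrix.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

# Reduced state of A: trace out B.
rho_a = np.einsum('ijkj->ik', rho_ab.reshape(2, 2, 2, 2))

print(entropy(rho_ab))  # ~0.0: the joint system is pure
print(entropy(rho_a))   # ~1.0: the subsystem is maximally mixed
```

The joint entropy (0) is strictly smaller than the subsystem entropy (1 bit), which can never happen classically.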
Notice that the above phenomenon cannot occur if a state is a separable pure state. In that case, the reduced states of the subsystems are also pure. Therefore all entropies are zero.
Relations to other entropy measures
The joint quantum entropy can be used to define the conditional quantum entropy:

$$S(\rho^A \mid \rho^B) \equiv S(\rho^A, \rho^B) - S(\rho^B)$$

and the quantum mutual information:

$$I(\rho^A : \rho^B) \equiv S(\rho^A) + S(\rho^B) - S(\rho^A, \rho^B).$$

These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
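Both derived quantities can be computed directly from the joint and subsystem entropies. A sketch for the Bell state, where the conditional entropy comes out negative (all helper names are our own):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, via eigenvalues."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def reduced_state(rho_ab, keep):
    """Reduced state of a two-qubit density matrix (keep=0 for A, keep=1 for B)."""
    r = rho_ab.reshape(2, 2, 2, 2)
    return np.einsum('ijkj->ik', r) if keep == 0 else np.einsum('ijil->jl', r)

# Bell state (|00> + |11>)/sqrt(2).
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)

s_ab = entropy(rho_ab)
s_a = entropy(reduced_state(rho_ab, 0))
s_b = entropy(reduced_state(rho_ab, 1))

cond = s_ab - s_b          # S(A|B) = S(A,B) - S(B): -1 bit, negative
mutual = s_a + s_b - s_ab  # I(A:B) = S(A) + S(B) - S(A,B): 2 bits
print(cond, mutual)
```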
See also

* Quantum relative entropy
* Quantum mutual information