Rényi entropy
In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min-entropy. Entropies quantify the diversity, uncertainty, or randomness of a system. The Rényi entropy is named after Alfréd Rényi.[1]
The Rényi entropy is important in ecology and statistics as an index of diversity. It is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group.[2][3] In theoretical computer science, the min-entropy is used in the context of randomness extractors.
Definition
The Rényi entropy of order $\alpha$, where $\alpha \geq 0$ and $\alpha \neq 1$, is defined as
- $H_\alpha(X) = \frac{1}{1-\alpha} \log\left(\sum_{i=1}^{n} p_i^\alpha\right)$.[1]
Here, $X$ is a discrete random variable with possible outcomes $1, 2, \dots, n$ and corresponding probabilities $p_i = \Pr(X = i)$ for $i = 1, \dots, n$, and the logarithm is base 2. If the probabilities are $p_i = 1/n$ for all $i$, then all the Rényi entropies of the distribution are equal: $H_\alpha(X) = \log n$. In general, for all discrete random variables $X$, $H_\alpha(X)$ is a non-increasing function in $\alpha$.
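A minimal Python sketch of this definition (the function name renyi_entropy is illustrative, and the Shannon case $\alpha = 1$ is handled as the limit):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (base 2) of a discrete distribution p, for order alpha >= 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # drop zero-probability outcomes (they contribute nothing)
    if np.isclose(alpha, 1.0):         # alpha = 1 is defined as a limit: the Shannon entropy
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# For the uniform distribution on n outcomes, every order gives log2(n):
n = 8
uniform = np.full(n, 1.0 / n)
print([round(renyi_entropy(uniform, a), 6) for a in (0, 0.5, 1, 2, 10)])  # all 3.0
```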
Applications often exploit the following relation between the Rényi entropy and the p-norm of the vector of probabilities:
- $H_\alpha(X) = \frac{\alpha}{1-\alpha} \log\left(\|P\|_\alpha\right)$.
Here, the discrete probability distribution $P = (p_1, \dots, p_n)$ is interpreted as a vector in $\mathbb{R}^n$ with $p_i \geq 0$ and $\sum_{i=1}^{n} p_i = 1$.
The Rényi entropy for any $\alpha \geq 0$ is Schur concave.
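Continuing the sketch above, the identity relating $H_\alpha$ to the $\alpha$-norm $\|P\|_\alpha = \left(\sum_i p_i^\alpha\right)^{1/\alpha}$ can be checked numerically:

```python
p = np.array([0.5, 0.25, 0.125, 0.125])
alpha = 3.0
p_norm = np.sum(p ** alpha) ** (1.0 / alpha)                  # ||P||_alpha
print(np.isclose(renyi_entropy(p, alpha),
                 (alpha / (1.0 - alpha)) * np.log2(p_norm)))  # True
```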
Special cases of the Rényi entropy
As $\alpha$ approaches zero, the Rényi entropy increasingly weighs all possible events more equally, regardless of their probabilities. In the limit for $\alpha \to 0$, the Rényi entropy is just the logarithm of the size of the support of $X$. The limit for $\alpha \to 1$ is the Shannon entropy. As $\alpha$ approaches infinity, the Rényi entropy is increasingly determined by the events of highest probability.
Hartley or max-entropy
Provided the probabilities are nonzero,[4] $H_0$ is the logarithm of the cardinality of $X$, sometimes called the Hartley entropy of $X$:
- $H_0(X) = \log n = \log |X|$.
Shannon entropy
The limiting value of $H_\alpha$ as $\alpha \to 1$ is the Shannon entropy:[5]
- $H_1(X) = -\sum_{i=1}^{n} p_i \log p_i$.
Collision entropy
Collision entropy, sometimes just called "Rényi entropy", refers to the case $\alpha = 2$,
- $H_2(X) = -\log \sum_{i=1}^{n} p_i^2 = -\log P(X = Y)$,
where $X$ and $Y$ are independent and identically distributed.
Min-entropy
In the limit as $\alpha \to \infty$, the Rényi entropy $H_\alpha$ converges to the min-entropy $H_\infty$:
- $H_\infty(X) = \min_i \left(-\log p_i\right) = -\log \max_i p_i$.
Equivalently, the min-entropy is the largest real number $b$ such that all events occur with probability at most $2^{-b}$.
The name min-entropy stems from the fact that it is the smallest entropy measure in the family of Rényi entropies. In this sense, it is the strongest way to measure the information content of a discrete random variable. In particular, the min-entropy is never larger than the Shannon entropy.
The min-entropy has important applications for randomness extractors in theoretical computer science: Extractors are able to extract randomness from random sources that have a large min-entropy; merely having a large Shannon entropy does not suffice for this task.
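Continuing the earlier sketch, the special cases above can be compared on a (arbitrarily chosen) skewed distribution; the min-entropy comes out smallest, and large orders approach it:

```python
p = np.array([0.6, 0.2, 0.1, 0.1])
H0    = renyi_entropy(p, 0)             # Hartley entropy: log2(4) = 2
H1    = renyi_entropy(p, 1)             # Shannon entropy
H2    = renyi_entropy(p, 2)             # collision entropy: -log2(sum p_i^2)
H_inf = -np.log2(p.max())               # min-entropy: -log2(max p_i)
print(H0 >= H1 >= H2 >= H_inf)          # True
print(round(H_inf, 3), round(renyi_entropy(p, 1000), 3))  # large alpha approaches H_inf
```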
Inequalities between different values of α
The fact that $H_\alpha$ is non-increasing in $\alpha$ for any given distribution of probabilities $p_1, \dots, p_n$ can be proven by differentiation,[6] as
- $\frac{\mathrm{d} H_\alpha}{\mathrm{d}\alpha} = -\frac{1}{(1-\alpha)^2} \sum_{i=1}^{n} z_i \log\!\left(\frac{z_i}{p_i}\right)$,
which is proportional to a Kullback–Leibler divergence (which is always non-negative), where $z_i = p_i^\alpha \big/ \sum_{j=1}^{n} p_j^\alpha$.
In particular cases inequalities can be proven also by Jensen's inequality:[7][8]
- $\log n = H_0 \geq H_1 \geq H_2 \geq H_\infty$.
For values of $\alpha > 1$, inequalities in the other direction also hold. In particular, we have[9]
- $H_2 \leq 2 H_\infty$.
On the other hand, the Shannon entropy $H_1$ can be arbitrarily high for a random variable $X$ that has a given min-entropy.
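Both statements are easy to probe numerically with the earlier renyi_entropy sketch (the distributions below are chosen arbitrarily):

```python
p = np.array([0.7, 0.15, 0.1, 0.05])
values = [renyi_entropy(p, a) for a in (0, 0.5, 1, 2, 5, 50)]
print(all(x >= y for x, y in zip(values, values[1:])))    # non-increasing in alpha: True

H2, H_inf = renyi_entropy(p, 2), -np.log2(p.max())
print(H2 <= 2 * H_inf)                                    # True

# Shannon entropy can exceed a fixed min-entropy by an arbitrary amount:
n = 2 ** 20
q = np.concatenate(([0.5], np.full(n, 0.5 / n)))
print(-np.log2(q.max()), round(renyi_entropy(q, 1), 2))   # min-entropy 1.0, Shannon = 11.0
```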
Rényi divergence
As well as the absolute Rényi entropies, Rényi also defined a spectrum of divergence measures generalising the Kullback–Leibler divergence.[10]
The Rényi divergence of order α, or alpha-divergence, of a distribution P from a distribution Q is defined to be
- $D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log\left(\sum_{i=1}^{n} \frac{p_i^\alpha}{q_i^{\alpha - 1}}\right)$
when $0 < \alpha < \infty$ and $\alpha \neq 1$. We can define the Rényi divergence for the special values $\alpha = 0, 1, \infty$ by taking a limit, and in particular the limit $\alpha \to 1$ gives the Kullback–Leibler divergence.
Some special cases:
- $D_0(P \| Q) = -\log Q(\{i : p_i > 0\})$ : minus the log probability under Q that $p_i > 0$;
- $D_{1/2}(P \| Q) = -2 \log \sum_i \sqrt{p_i q_i}$ : minus twice the logarithm of the Bhattacharyya coefficient; (Nielsen & Boltz (2009))
- $D_1(P \| Q) = \sum_i p_i \log \frac{p_i}{q_i}$ : the Kullback–Leibler divergence;
- $D_2(P \| Q) = \log \left\langle \frac{p_i}{q_i} \right\rangle$ : the log of the expected ratio of the probabilities;
- $D_\infty(P \| Q) = \log \sup_i \frac{p_i}{q_i}$ : the log of the maximum ratio of the probabilities.
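A Python sketch of the finite-α formula, checked against three of the special cases above (base-2 logarithms are used to match the entropy definitions; the limits α = 1 and α = ∞ are approximated by nearby finite orders):

```python
def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in bits, for 0 < alpha < inf, alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                                   # outcomes with p_i = 0 contribute nothing
    return np.log2(np.sum(p[mask] ** alpha / q[mask] ** (alpha - 1))) / (alpha - 1)

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.5, 0.25, 0.25])

kl    = np.sum(p * np.log2(p / q))                 # D_1: Kullback-Leibler divergence
bhatt = -2 * np.log2(np.sum(np.sqrt(p * q)))       # D_1/2: Bhattacharyya form
d_inf = np.log2(np.max(p / q))                     # D_inf: log of the maximum ratio

print(np.isclose(renyi_divergence(p, q, 0.5), bhatt))            # True
print(np.isclose(renyi_divergence(p, q, 1 + 1e-6), kl))          # limit alpha -> 1
print(np.isclose(renyi_divergence(p, q, 100), d_inf, atol=0.05)) # large alpha approaches D_inf
```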
The Rényi divergence is indeed a divergence, meaning simply that $D_\alpha(P \| Q)$ is greater than or equal to zero, and zero only when P = Q. For any fixed distributions P and Q, the Rényi divergence is nondecreasing as a function of its order α, and it is continuous on the set of α for which it is finite.[10]
Why α=1 is special
The value α = 1, which gives the Shannon entropy and the Kullback–Leibler divergence, is special because it is only at α=1 that the chain rule of conditional probability holds exactly:
- $H(A, X) = H(A) + H(X \mid A)$
for the absolute entropies, and
- $D_{\mathrm{KL}}\big(p(x \mid a)\, p(a) \,\big\|\, m(x, a)\big) = D_{\mathrm{KL}}\big(p(a) \,\big\|\, m(a)\big) + \mathbb{E}_{p(a)}\big[ D_{\mathrm{KL}}\big(p(x \mid a) \,\big\|\, m(x \mid a)\big) \big]$
for the relative entropies.
The latter in particular means that if we seek a distribution p(x,a) which minimizes the divergence from some underlying prior measure m(x,a), and we acquire new information which only affects the distribution of a, then the distribution of p(x|a) remains m(x|a), unchanged.
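A small numerical illustration of the first identity, continuing the earlier sketch (the joint distribution is chosen arbitrarily; note that several inequivalent definitions of conditional Rényi entropy exist, and the naive average used here is the one appearing in the Shannon chain rule): the decomposition is exact at α = 1 but fails at, for example, α = 2.

```python
# Reuses numpy (np) and renyi_entropy from the sketch in the Definition section.
joint = np.array([[0.4, 0.1],
                  [0.2, 0.3]])          # hypothetical joint distribution p(a, x)
p_a = joint.sum(axis=1)                 # marginal distribution of A

def avg_cond_entropy(joint, p_a, alpha):
    """Naive conditional entropy: sum_a p(a) * H_alpha(X | A = a)."""
    return sum(pa * renyi_entropy(row / pa, alpha) for pa, row in zip(p_a, joint))

for alpha in (1, 2):
    lhs = renyi_entropy(joint.ravel(), alpha)                        # H_alpha(A, X)
    rhs = renyi_entropy(p_a, alpha) + avg_cond_entropy(joint, p_a, alpha)
    print(alpha, np.isclose(lhs, rhs))  # True for alpha = 1, False for alpha = 2
```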
The other Rényi divergences satisfy the criteria of being positive and continuous, being invariant under one-to-one coordinate transformations, and combining additively when A and X are independent, so that if p(A,X) = p(A)p(X), then
- $H_\alpha(A, X) = H_\alpha(A) + H_\alpha(X)$
and
- $D_\alpha\big(P(A)P(X) \,\big\|\, Q(A)Q(X)\big) = D_\alpha\big(P(A) \,\big\|\, Q(A)\big) + D_\alpha\big(P(X) \,\big\|\, Q(X)\big)$.
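The additivity of the entropies over independent variables is easy to check numerically (continuing the sketch):

```python
p_a = np.array([0.7, 0.3])
p_x = np.array([0.5, 0.3, 0.2])
joint_indep = np.outer(p_a, p_x)        # p(A, X) = p(A) p(X)
for alpha in (0.5, 1, 2, 5):
    lhs = renyi_entropy(joint_indep.ravel(), alpha)
    rhs = renyi_entropy(p_a, alpha) + renyi_entropy(p_x, alpha)
    print(alpha, np.isclose(lhs, rhs))  # True for every alpha
```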
The stronger properties of the α = 1 quantities, which allow the definition of conditional information and mutual information from communication theory, may be very important in other applications, or entirely unimportant, depending on those applications' requirements.
Exponential families
The Rényi entropies and divergences for an exponential family admit simple expressions:[11]
- $H_\alpha\big(p_F(x; \theta)\big) = \frac{1}{1-\alpha}\left(F(\alpha\theta) - \alpha F(\theta) + \log E_p\!\left[e^{(\alpha - 1)k(x)}\right]\right)$
and
- $D_\alpha(p : q) = \frac{J_{F,\alpha}(\theta : \theta')}{1 - \alpha}$,
where
- $J_{F,\alpha}(\theta : \theta') = \alpha F(\theta) + (1 - \alpha) F(\theta') - F\big(\alpha\theta + (1 - \alpha)\theta'\big)$
is a Jensen difference divergence. Here $F$ is the log-normalizer of the family, $\theta$ and $\theta'$ are natural parameters, and $k(x)$ is the carrier measure.
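As a sanity check of the divergence formula, here is a sketch using the unit-variance Gaussian family, for which the natural parameter is θ = μ and the log-normalizer is F(θ) = θ²/2 (natural logarithms, so the result is in nats); the closed form is compared against direct numerical integration:

```python
import numpy as np

def F(theta):                  # log-normalizer of N(mu, 1) with natural parameter theta = mu
    return 0.5 * theta ** 2

def renyi_div_closed_form(theta, theta_p, alpha):
    """Closed form D_alpha = J_{F,alpha}(theta : theta') / (1 - alpha), in nats."""
    J = alpha * F(theta) + (1 - alpha) * F(theta_p) - F(alpha * theta + (1 - alpha) * theta_p)
    return J / (1 - alpha)

# Numerical reference: D_alpha = (1/(alpha - 1)) * ln \int p^alpha q^(1-alpha) dx
theta, theta_p, alpha = 0.0, 1.5, 0.7
x = np.linspace(-15.0, 15.0, 60001)
dx = x[1] - x[0]
p = np.exp(-0.5 * (x - theta) ** 2) / np.sqrt(2 * np.pi)
q = np.exp(-0.5 * (x - theta_p) ** 2) / np.sqrt(2 * np.pi)
numeric = np.log(np.sum(p ** alpha * q ** (1 - alpha)) * dx) / (alpha - 1)

print(round(renyi_div_closed_form(theta, theta_p, alpha), 4), round(numeric, 4))
# Both equal alpha * (mu - mu')^2 / 2 = 0.7875 for these parameters.
```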
Physical meaning
The Rényi entropy in quantum physics is not considered to be an observable, due to its nonlinear dependence on the density matrix. (The Shannon entropy shares this nonlinear dependence.) Recently, Nazarov showed a correspondence that reveals the physical meaning of the Rényi entropy flow in time. His proposal is similar in spirit to the fluctuation-dissipation theorem and allows the quantum entropy to be measured using the full counting statistics (FCS) of energy transfers.[12][13][14]
Notes
- Rényi (1961)
- Franchini (2008)
- Its (2010)
- RFC 4086, page 6
- Bromiley, Thacker & Bouhova-Thacker (2004)
- Beck (1993)
- $H_1 \geq H_2$ holds because $-\sum_{i=1}^{n} p_i \log p_i \geq -\log \sum_{i=1}^{n} p_i^2$ by Jensen's inequality.
- $H_2 \geq H_\infty$ holds because $\sum_{i=1}^{n} p_i^2 \leq \sup_i p_i \sum_{i=1}^{n} p_i = \sup_i p_i$.
- $H_2 \leq 2 H_\infty$ holds because $\sum_{i=1}^{n} p_i^2 \geq \sup_i p_i^2$, so that $-\log \sum_{i=1}^{n} p_i^2 \leq -2 \log \sup_i p_i$.
- Van Erven, Tim; Harremoës, Peter (2014). "Rényi Divergence and Kullback-Leibler Divergence". IEEE Transactions on Information Theory 60 (7): 3797–3820. arXiv:1206.2459. doi:10.1109/TIT.2014.2320500.
- Nielsen & Nock (2011)
- Nazarov (2011)
- Nazarov (2015a)
- Nazarov (2015b)
References
- Beck, Christian; Schlögl, Friedrich (1993). Thermodynamics of chaotic systems: an introduction. Cambridge University Press. ISBN 0521433673.
- Jizba, P.; Arimitsu, T. (2004). "The world according to Rényi: Thermodynamics of multifractal systems". Annals of Physics (Elsevier) 312: 17–59. arXiv:cond-mat/0207707. Bibcode:2004AnPhy.312...17J. doi:10.1016/j.aop.2004.01.002.
- Jizba, P.; Arimitsu, T. (2004). "On observability of Rényi's entropy". Physical Review E (APS) 69: 026128–1 – 026128–12. arXiv:cond-mat/0307698. Bibcode:2004PhRvE..69b6128J. doi:10.1103/PhysRevE.69.026128.
- Bromiley, P.A.; Thacker, N.A.; Bouhova-Thacker, E. (2004), Shannon Entropy, Renyi Entropy, and Information (PDF)
- Franchini, F.; Its, A.R.; Korepin, V.E. (2008). "Rényi entropy as a measure of entanglement in quantum spin chain". Journal of Physics A: Mathematical and Theoretical (IOPScience) 41 (2): 025302. arXiv:0707.2534. Bibcode:2008JPhA...41b5302F. doi:10.1088/1751-8113/41/2/025302.
- Hazewinkel, Michiel, ed. (2001), "Rényi test", Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4
- Hero, A. O.; Michael, O.; Gorman, J. (2002). "Alpha-divergences for Classification, Indexing and Retrieval" (PDF).
- Its, A. R.; Korepin, V. E. (2010). "Generalized entropy of the Heisenberg spin chain". Theoretical and Mathematical Physics (Springer) 164 (3): 1136–1139. Bibcode:2010TMP...164.1136I. doi:10.1007/s11232-010-0091-6. Retrieved 7 Mar 2012.
- Nielsen, F.; Boltz, S. (2010). "The Burbea-Rao and Bhattacharyya centroids". arXiv:1004.5049.
- Nielsen, Frank; Nock, Richard (2012). "A closed-form expression for the Sharma-Mittal entropy of exponential families". Journal of Physics A: Mathematical and Theoretical 45 (3): 032003. arXiv:1112.4221. Bibcode:2012JPhA...45c2003N. doi:10.1088/1751-8113/45/3/032003.
- Nielsen, Frank; Nock, Richard (2011). "On Rényi and Tsallis entropies and divergences for exponential families". arXiv:1105.3259. Bibcode:2011arXiv1105.3259N.
- Rényi, Alfréd (1961). "On measures of information and entropy" (PDF). Proceedings of the fourth Berkeley Symposium on Mathematics, Statistics and Probability 1960. pp. 547–561.
- Rosso, O.A. (2006). "EEG analysis using wavelet-based information tools". Journal of Neuroscience Methods 153 (2): 163–182. doi:10.1016/j.jneumeth.2005.10.009. PMID 16675027.
- Nazarov, Y. (2011). "Flows of Renyi entropies". Physical Review B 84: 205437. arXiv:1108.3537.
- Nazarov, Y. (2015). "Renyi entropy flows from quantum heat engines". Physical Review B 91: 104303. arXiv:1408.3910. Bibcode:2015PhRvB..91j4303A. doi:10.1103/PhysRevB.91.104303.
- Nazarov, Y. (2015). "Exact correspondence between Renyi entropy flows and physical flows". Physical Review B 91: 174307. arXiv:1502.08020. doi:10.1103/PhysRevB.91.174307.