Kullback–Leibler information

Quick Reference

A measure of the difference between two probability density functions (f and g, say) taking non-zero values on the same interval (a, b). Usually denoted by I, it is related to entropy and is defined as the expectation of the logarithm of the ratio of likelihoods. The information in favour of f is I(f : g) = \int_a^b f(x)\,\ln\{f(x)/g(x)\}\,dx. An equivalent expression holds for the comparison of two discrete distributions. The measure was introduced in 1951 by Kullback and Leibler. See also Bhattacharyya distance; Hellinger distance.
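The discrete analogue of the integral above replaces it with a sum over the common support. A minimal sketch in Python (the function name and example distributions are illustrative, not from the entry), assuming both distributions are given as probability lists on the same support and that g is non-zero wherever f is:

```python
import math

def kl_information(f, g):
    """Kullback-Leibler information I(f : g) for two discrete
    distributions given as sequences of probabilities on a common
    support. Terms with f[i] == 0 contribute nothing; assumes
    g[i] > 0 wherever f[i] > 0."""
    return sum(p * math.log(p / q) for p, q in zip(f, g) if p > 0)

# Illustrative distributions (hypothetical data)
f = [0.5, 0.3, 0.2]
g = [0.4, 0.4, 0.2]
print(kl_information(f, g))
```

Note that I(f : g) is always non-negative, equals zero only when f and g coincide, and is not symmetric: in general I(f : g) differs from I(g : f), which is why it is an "information in favour of f" rather than a metric.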

Subjects: Probability and Statistics.
