- Entropy can intuitively be thought of as a measure of the information contained in a probability distribution, or equivalently, of the uncertainty of a random variable.
- Mutual information is a measure of the amount of information one random variable contains about another.
- Relative entropy is a measure of the "distance" between two probability distributions (it is not a true metric, since it is not symmetric).
Note
Entropy can be thought of as the self-information of a random variable, and mutual information is a special case of relative entropy.
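In symbols (using the definitions given in the sections below), these two observations can be written as

$$H(X) = \mathbb{E}_p\!\left[\log \frac{1}{p(X)}\right] \qquad \text{and} \qquad I(X;X) = H(X),$$

and the Mutual Information section below states explicitly how mutual information is a relative entropy.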
Entropy
Definition
The entropy $H(X)$ of a discrete random variable $X$ with probability mass function $p(x)$ is defined by

$$H(X) = -\sum_{x} p(x) \log p(x)$$

Unless otherwise specified, the logarithm is taken to base 2 and entropy is measured in bits.
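As a quick numerical check, here is a minimal Python sketch (not from the original text; the function name and example distributions are arbitrary choices) that evaluates this sum for a probability vector:

```python
import numpy as np

def entropy(p):
    """Entropy H(X) in bits of a discrete distribution given as a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))           # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))           # ~0.47 bits: a biased coin is less uncertain
```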
Joint Entropy
Definition
The joint entropy $H(X, Y)$ of a pair of discrete random variables $(X, Y)$ with a joint distribution $p(x, y)$ is defined as

$$H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log p(x, y)$$

which can also be expressed as

$$H(X, Y) = -\mathbb{E}\left[\log p(X, Y)\right]$$
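The same sketch extends directly to the joint case; the joint pmf below is an arbitrary example chosen so that the answer comes out to a round number:

```python
import numpy as np

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits of a joint pmf given as a 2-D array p_xy[x, y]."""
    p = np.asarray(p_xy, dtype=float).ravel()
    p = p[p > 0]                     # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

# Joint pmf over two binary variables (dyadic probabilities, chosen for illustration).
p_xy = np.array([[0.500, 0.250],
                 [0.125, 0.125]])
print(joint_entropy(p_xy))           # 1.75 bits
```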
Relative Entropy or Kullback-Leibler Distance
Definition
The relative entropy or Kullback-Leibler distance between two probability mass functions $p(x)$ and $q(x)$ is defined as

$$D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}$$

with the conventions that $0 \log \frac{0}{q} = 0$ and $p \log \frac{p}{0} = \infty$.
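The sketch below (again illustrative; the two pmfs are arbitrary) evaluates this sum and also shows that swapping the arguments changes the value, which is why relative entropy is not a true metric:

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits between two pmfs on the same alphabet."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                     # convention: 0 * log(0/q) = 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = [0.50, 0.50]
q = [0.75, 0.25]
print(kl_divergence(p, q))           # ~0.208 bits
print(kl_divergence(q, p))           # ~0.189 bits -- not symmetric
```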
Mutual Information
Definition
Consider two random variables $X$ and $Y$ with a joint probability mass function $p(x, y)$ and marginal probability mass functions $p(x)$ and $p(y)$. The mutual information $I(X; Y)$ is the relative entropy between the joint distribution and the product distribution $p(x)p(y)$:

$$I(X; Y) = \sum_{x}\sum_{y} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)} = D\big(p(x, y) \,\|\, p(x)\,p(y)\big)$$
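Putting the pieces together, the following sketch (with an arbitrary correlated joint pmf) computes the marginals from the joint distribution and evaluates the double sum; the result is exactly the relative entropy between $p(x, y)$ and $p(x)p(y)$:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X; Y) in bits, computed from a joint pmf p_xy[x, y]."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)    # marginal p(x), shape (|X|, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)    # marginal p(y), shape (1, |Y|)
    prod = p_x * p_y                         # product distribution p(x) p(y)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask]))

# X and Y agree 80% of the time, so they share a positive amount of information.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(mutual_information(p_xy))              # ~0.278 bits
```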