Basic formulas in information theory


Self-information
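
A standard form, assuming a discrete random variable X with probability mass function p(x); the base of the logarithm sets the unit (bits for base 2, nats for base e):

$$I(x) = -\log p(x)$$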

Entropy (Shannon Entropy)

Entropy is the expectation of self-information.
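
In the discrete case, with the same notation as above, this gives:

$$H(X) = \mathbb{E}\big[I(X)\big] = -\sum_{x} p(x) \log p(x)$$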

Joint entropy
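
For a pair of discrete random variables X and Y with joint distribution p(x, y), the usual definition is:

$$H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log p(x, y)$$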

Cross entropy
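
For two distributions p and q over the same discrete alphabet, the cross entropy is commonly written as:

$$H(p, q) = -\sum_{x} p(x) \log q(x)$$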

Mutual information
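
One common form for discrete X and Y, with joint distribution p(x, y) and marginals p(x), p(y):

$$I(X; Y) = \sum_{x}\sum_{y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}$$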

Basic properties of mutual information
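
The standard identities relating mutual information to entropy (same notation as above): it is symmetric, non-negative, and measures the reduction in uncertainty about one variable given the other:

$$I(X; Y) = I(Y; X) \ge 0$$

$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X, Y)$$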

Kullback-Leibler divergence (information gain)
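
For discrete distributions p and q with q(x) > 0 wherever p(x) > 0, the usual definition is:

$$D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}$$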

Basic properties of KL divergence

  • The KL divergence is always non-negative
  • The KL divergence is not symmetric
  • The relation between KL divergence and cross entropy (summarized below)
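
In symbols, and under the same assumptions as above, these three properties read:

$$D_{\mathrm{KL}}(p \,\|\, q) \ge 0$$

$$D_{\mathrm{KL}}(p \,\|\, q) \ne D_{\mathrm{KL}}(q \,\|\, p) \text{ in general}$$

$$H(p, q) = H(p) + D_{\mathrm{KL}}(p \,\|\, q)$$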

