Basic formulas in information theory
10 Apr 2017 | Machine Learning, Information Theory, Entropy, KL Divergence
Self-information
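The usual definition, for a discrete random variable $X$ with probability mass function $p(x)$ (the setting assumed throughout these formulas), is

$$ I(x) = -\log p(x) $$

With base-2 logarithms the unit is bits; with natural logarithms it is nats. Less probable outcomes carry more information.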
Entropy (Shannon Entropy)
Expectation of self-information
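Written out under the same assumptions, the Shannon entropy is the expected self-information of $X$:

$$ H(X) = \mathbb{E}[I(X)] = -\sum_x p(x)\log p(x) $$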
Joint entropy
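For two discrete random variables $X$ and $Y$ with joint distribution $p(x, y)$, the standard definition is

$$ H(X, Y) = -\sum_{x}\sum_{y} p(x, y)\log p(x, y) $$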
Cross entropy
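For two distributions $p$ and $q$ over the same set of outcomes, the usual form is

$$ H(p, q) = -\sum_x p(x)\log q(x) $$

Intuitively, it is the average code length when samples drawn from $p$ are encoded with a code optimized for $q$.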
Mutual information
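The textbook definition for discrete $X$ and $Y$ is

$$ I(X; Y) = \sum_x \sum_y p(x, y)\log\frac{p(x, y)}{p(x)\,p(y)} $$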
Basic property of mutual information
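The identities usually stated here (all standard consequences of the definitions above, with $H(X \mid Y)$ denoting the conditional entropy):

$$ I(X; Y) = I(Y; X) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X, Y) \ge 0 $$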
Kullback-Leibler divergence (information gain)
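For discrete distributions $p$ and $q$ (with the convention that $q(x) = 0$ implies $p(x) = 0$), the standard definition is

$$ D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x)\log\frac{p(x)}{q(x)} $$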
Basic properties of KL divergence
- The KL divergence is always non-negative
- The KL divergence is not symmetric
- The relation between KL divergence and cross entropy (see the formulas and sketch below)
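In formulas, these standard properties read

$$ D_{\mathrm{KL}}(p \,\|\, q) \ge 0, \qquad D_{\mathrm{KL}}(p \,\|\, q) \ne D_{\mathrm{KL}}(q \,\|\, p) \text{ in general}, \qquad D_{\mathrm{KL}}(p \,\|\, q) = H(p, q) - H(p) $$

A minimal Python sketch that checks them numerically; the two distributions are made-up examples, and natural logarithms (nats) are used:

```python
import numpy as np

# Two made-up discrete distributions over the same three outcomes
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross entropy H(p, q) in nats."""
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL divergence D_KL(p || q) in nats."""
    return np.sum(p * np.log(p / q))

print(kl_divergence(p, q) >= 0)                    # non-negative: True
print(kl_divergence(p, q), kl_divergence(q, p))    # not symmetric in general
# Relation to cross entropy: D_KL(p || q) = H(p, q) - H(p)
print(np.isclose(kl_divergence(p, q), cross_entropy(p, q) - entropy(p)))  # True
```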