KL Divergence

Creator: Seonglae Cho
Created: 2023 Mar 23 2:20
Edited: 2025 Mar 9 23:59

Relative Entropy, Kullback-Leibler Divergence, I-divergence

A measure for comparing two probability distributions. It is asymmetric, so it is not a true metric.

D_KL(p ‖ q) = Σₓ p(x) log(p(x)/q(x))

The divergence is large when q(x) is small where p(x) is large (mathematically, because of the p(x)/q(x) ratio inside the log). In other words, a large probability difference makes a bigger divergence.
  • Equals 0 when the two distributions are identical, and is greater than 0 otherwise (the gap between expected information and the result: prior vs posterior)
  • Divergence here means just difference, not divergence to infinity
  • It diverges to infinity if the Support of p and q does not overlap (q(x) = 0 somewhere p(x) > 0)
  • Minimizing KL divergence is equivalent to maximizing log likelihood
  • KL divergence is a Popular Distance
  • It is the value of cross-entropy minus entropy. It is not a true distance metric
  • In the case of KL divergence, we have the correspondence between MLE and KL matching
  • It has an analytic (closed-form) solution if both p and q follow normal distributions
In the forward direction, q is trained to cover p
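These properties can be checked numerically. A minimal sketch in plain Python (the distributions p and q below are made-up examples) verifying non-negativity, asymmetry, and the cross-entropy-minus-entropy identity:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    """D_KL(p || q) for discrete distributions over the same support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]  # example distributions (assumed for illustration)
q = [0.4, 0.4, 0.2]

print(kl(p, p))                 # 0 for identical distributions
print(kl(p, q), kl(q, p))       # asymmetric: the two orders differ
print(kl(p, q) - (cross_entropy(p, q) - entropy(p)))  # identity: ~0
```

The identity D_KL(p ‖ q) = H(p, q) − H(p) is why minimizing cross-entropy loss against one-hot labels is the same as minimizing KL divergence to the data distribution.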

Analytic KL divergence

Since KL divergence yields different values depending on the order of its arguments, when using a KL loss we choose the order based on the objective. The closed form differs depending on the Probability Distribution.
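For two univariate normals the closed form is D_KL(N(μ₁, σ₁²) ‖ N(μ₂, σ₂²)) = log(σ₂/σ₁) + (σ₁² + (μ₁ − μ₂)²)/(2σ₂²) − 1/2. A sketch that checks this formula against a Monte Carlo estimate (the parameter values are arbitrary examples):

```python
import math
import random

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form D_KL(N(mu1, s1^2) || N(mu2, s2^2))."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

def log_pdf(x, mu, s):
    """Log density of N(mu, s^2)."""
    return -0.5 * math.log(2 * math.pi * s ** 2) - (x - mu) ** 2 / (2 * s ** 2)

mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0  # assumed example parameters
analytic = kl_gauss(mu1, s1, mu2, s2)

# Monte Carlo estimate of E_{x~p}[log p(x) - log q(x)] with x ~ N(mu1, s1^2)
random.seed(0)
xs = [random.gauss(mu1, s1) for _ in range(100_000)]
mc = sum(log_pdf(x, mu1, s1) - log_pdf(x, mu2, s2) for x in xs) / len(xs)

print(analytic, mc)  # the two estimates should agree closely
```

The analytic form is what makes the KL term in a Gaussian VAE loss cheap to compute: no sampling is needed for that term.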

Mode covering - Forward

Minimizing the forward KL D_KL(p ‖ q) forces q(x) to be large wherever p(x) is large, so q spreads out to cover the whole distribution (all modes of p).

Mode seeking - Backward

Minimizing the reverse KL D_KL(q ‖ p) forces q(x) to be small wherever p(x) is small, so q converges onto a single mode of p.
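The two behaviors can be seen by fitting a single Gaussian q to a bimodal target p under each direction. A rough grid-search sketch (the mixture target, the grid, and the search ranges are illustrative assumptions): forward KL picks a wide q centered between the modes, while reverse KL locks onto one mode.

```python
import math

def normal_pdf(x, mu, s):
    """Density of N(mu, s^2) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))

dx = 0.02
xs = [i * dx for i in range(-400, 401)]  # grid on [-8, 8]

# Bimodal target p: equal mixture of N(-2, 0.5^2) and N(2, 0.5^2)
p = [0.5 * normal_pdf(x, -2, 0.5) + 0.5 * normal_pdf(x, 2, 0.5) for x in xs]

def kl(a, b):
    """Discretized D_KL(a || b) on the grid; inf on support mismatch."""
    total = 0.0
    for ai, bi in zip(a, b):
        if ai < 1e-12:
            continue
        if bi < 1e-300:
            return float("inf")
        total += ai * math.log(ai / bi) * dx
    return total

def fit(direction):
    """Grid-search a single Gaussian q = N(mu, s^2) minimizing the chosen KL."""
    best = None
    for mu in [m / 10 for m in range(-30, 31, 2)]:
        for s in [k / 10 for k in range(2, 31)]:
            q = [normal_pdf(x, mu, s) for x in xs]
            d = kl(p, q) if direction == "forward" else kl(q, p)
            if best is None or d < best[0]:
                best = (d, mu, s)
    return best

print("forward:", fit("forward"))  # mode covering: wide q centered near 0
print("reverse:", fit("reverse"))  # mode seeking: narrow q near one mode
```

This is the asymmetry in practice: variational inference (reverse KL) tends to under-cover the posterior, while maximum likelihood (forward KL) tends to over-spread.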
