KL Divergence

Creator: Seonglae Cho
Created: 2023 Mar 23 2:20
Edited: 2024 Nov 15 15:41

Relative Entropy, Kullback-Leibler Divergence, I-divergence

A measure for comparing two distributions. It is asymmetric, so it is not a true metric.

D_KL(P ‖ Q) = Σ_x p(x) log( p(x) / q(x) )

The divergence is large when q(x) is small where p(x) is large, since the ratio p(x)/q(x) inside the log blows up. In other words, a large probability difference makes the divergence bigger.
  • Equals 0 when the two distributions are identical, and is greater than 0 otherwise (i.e., the gap between the expected information and the actual outcome: prior vs posterior)
  • "Divergence" here simply means difference
  • Minimizing KL divergence is equivalent to maximizing the log likelihood
  • KL divergence is a popular notion of "distance" between distributions
  • It equals cross-entropy minus entropy; it is not a true distance
  • For KL divergence, MLE corresponds to KL matching
  • Has an analytic solution if both p and q follow the normal distribution
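The definition and the asymmetry above can be checked directly on small discrete distributions. A minimal sketch (assuming numpy; the function name `kl_divergence` and the example arrays are illustrative, not from the original note):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x p(x) * log(p(x) / q(x)) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only where p > 0 (the 0 * log 0 terms are taken as 0).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.4, 0.1])
q = np.array([0.3, 0.3, 0.4])

print(kl_divergence(p, p))  # 0.0: identical distributions
print(kl_divergence(p, q))  # positive
print(kl_divergence(q, p))  # positive, but different from the above: KL is asymmetric
```

Note that D_KL(P ‖ Q) ≠ D_KL(Q ‖ P) in general, which is exactly why KL is not a true distance.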

Analytic KL divergence

The closed form differs depending on the probability distribution.
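As an example of such a closed form (a sketch, assuming numpy; the helper names are illustrative), the well-known analytic KL between two univariate Gaussians, D_KL = log(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − 1/2, can be verified against numerical integration:

```python
import numpy as np

def gaussian_kl(mu1, sigma1, mu2, sigma2):
    """Closed-form D_KL(N(mu1, sigma1^2) || N(mu2, sigma2^2))."""
    return (np.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0

# Numerical check: integrate p(x) * log(p(x) / q(x)) on a fine grid.
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
p = normal_pdf(x, mu1, s1)
q = normal_pdf(x, mu2, s2)
numeric = float(np.sum(p * np.log(p / q)) * dx)

analytic = gaussian_kl(mu1, s1, mu2, s2)
print(analytic)  # log(2) + 2/8 - 1/2
print(numeric)   # agrees closely with the analytic value
```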

Mode covering

Minimizing the forward KL D_KL(p ‖ q) forces q(x) to be large wherever p(x) is large, so the fitted q spreads out to cover all modes of p.

Mode seeking

Minimizing the reverse KL D_KL(q ‖ p) penalizes q(x) being large where p(x) is small, so q concentrates on (seeks) a single mode of p.
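The covering vs seeking contrast can be seen by fitting a single Gaussian q to a bimodal p, once under forward KL and once under reverse KL. A minimal grid-search sketch (assuming numpy; the bimodal target, grid, and mode locations at ±3 are illustrative assumptions):

```python
import numpy as np

def kl(p, q):
    """Discrete D_KL(p || q) over a shared grid."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

x = np.linspace(-8, 8, 1601)

def gaussian(mu, sigma):
    d = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return d / d.sum()  # normalize on the grid

# Bimodal target: two well-separated modes at -3 and +3.
p = 0.5 * gaussian(-3, 0.7) + 0.5 * gaussian(3, 0.7)

# Fit a unit-width Gaussian q by grid search over its mean.
means = np.linspace(-5, 5, 201)
forward = [kl(p, gaussian(m, 1.0)) for m in means]   # D_KL(p || q): mode covering
reverse = [kl(gaussian(m, 1.0), p) for m in means]   # D_KL(q || p): mode seeking

print(means[np.argmin(forward)])  # near 0: q sits between the modes to cover both
print(means[np.argmin(reverse)])  # near -3 or +3: q locks onto a single mode
```

Forward KL averages over p, so q ends up straddling both modes; reverse KL averages over q, so placing mass where p is tiny is heavily punished and q collapses onto one mode.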
