PAC Bayes theorem

Creator
Seonglae Cho
Created
2023 Nov 21 11:27
Edited
2025 Jun 2 16:27

While Bayesian inference pairs a unique prior with a unique posterior, PAC-Bayes is completely model-free.

PAC-Bayes is a generic framework for rethinking generalization across numerous machine learning algorithms. It leverages the flexibility of Bayesian learning and allows one to derive new learning algorithms. An important component of the PAC-Bayes analysis is the choice of the prior distribution.
When using a probabilistic classifier, it mathematically justifies keeping the prior and posterior distributions as similar as possible: the gap between the expected empirical error and the expected generalization error is controlled by the KL Divergence between the posterior and the prior.
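One standard form of this statement is McAllester's bound, sketched here for reference: for a prior $P$ fixed before seeing the data, with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for every posterior $Q$,

$$
\mathbb{E}_{h \sim Q}\!\left[L(h)\right] \;\le\; \mathbb{E}_{h \sim Q}\!\left[\hat{L}(h)\right] \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln \frac{2\sqrt{n}}{\delta}}{2n}}
$$

where $L$ is the generalization error and $\hat{L}$ the empirical error. The $\mathrm{KL}(Q \| P)$ term in the numerator is exactly why the posterior should stay close to the prior.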
KL Divergence is a measure of closeness between distributions, and as with the VC dimension, the Generalization Error bound follows the curse of dimensionality: it grows with the data dimension and is inversely proportional to the confidence parameter δ.
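The dependence on the KL term can be checked numerically. A minimal sketch, assuming the McAllester-style bound above (the function name and the example values are illustrative, not from the original note):

```python
import math

def mcallester_bound(emp_risk: float, kl: float, n: int, delta: float) -> float:
    """McAllester-style PAC-Bayes upper bound on the expected generalization error.

    emp_risk: expected empirical error under the posterior Q
    kl:       KL(Q || P) between posterior and prior
    n:        sample size
    delta:    confidence parameter (bound holds with probability >= 1 - delta)
    """
    complexity = (kl + math.log(2 * math.sqrt(n) / delta)) / (2 * n)
    return emp_risk + math.sqrt(complexity)

# A posterior far from the prior (large KL) pays a larger complexity penalty
loose = mcallester_bound(0.1, kl=50.0, n=10_000, delta=0.05)
# Keeping the posterior close to the prior (small KL) tightens the bound
tight = mcallester_bound(0.1, kl=5.0, n=10_000, delta=0.05)
```

With the same empirical risk, the small-KL posterior gets the tighter guarantee, and increasing `n` shrinks the complexity term for both.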