How confident we are in a hypothesis after seeing evidence
The posterior describes the probability of a hypothesis given the data. It is a judgment made after observing the data, whereas the prior is a judgment based on information or experience available before observing the data. Posterior estimation is therefore a key part of machine learning: once we know the posterior, we can determine the label for the data.
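As a minimal sketch (with H denoting a hypothesis and D the observed data), Bayes' theorem ties these pieces together:

P(H | D) = P(D | H) · P(H) / P(D)

where P(H | D) is the posterior, P(D | H) the likelihood, P(H) the prior, and P(D) the evidence.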
The posterior density is often intractable because of the evidence term: computing it requires integrating the likelihood over all possible parameter values (the marginal likelihood), which rarely has a closed form. In essence, the posterior probability represents the updated probability of an event after new evidence or information has been taken into account.
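The troublesome term is the evidence P(D) = ∫ P(D | θ) P(θ) dθ, an integral over every possible parameter value θ. The sketch below (Python, with a hypothetical coin-bias example and made-up counts of 7 heads and 3 tails) shows the one-dimensional case where this integral collapses to a finite sum; in high dimensions no such direct summation is feasible, which is what motivates approximations like Variational Inference.

```python
import numpy as np

# Grid approximation of the posterior over a coin's bias theta.
# Tractable only because theta is one-dimensional and discretized;
# the same normalization becomes an intractable integral in high dimensions.
theta = np.linspace(0.01, 0.99, 99)          # candidate hypotheses H
prior = np.full_like(theta, 1 / len(theta))  # uniform prior P(H)
heads, tails = 7, 3                          # assumed observed data D

likelihood = theta**heads * (1 - theta)**tails       # P(D | H)
evidence = np.sum(likelihood * prior)                # P(D), here a finite sum
posterior = likelihood * prior / evidence            # P(H | D)

print("MAP estimate:", theta[np.argmax(posterior)])  # ~0.7
```

With a uniform prior, the MAP estimate here lands near 0.7, matching the empirical frequency of heads in the assumed data.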
Understanding Variational Inference - From MLE and MAP to ELBO
For those who want to understand Variational Inference, a technique for approximating probability distributions, this post explains the fundamental reason for estimating probability distributions and covers frequently used terms such as MLE, MAP, KL divergence, and ELBO.
https://modulabs.co.kr/blog/variational-inference-intro/


Seonglae Cho