Training objective: minimize the negative log-likelihood of the data.
A Normalizing Flow (NF) is a model for distribution approximation: it transforms a simple base distribution into a complex one by applying a sequence of invertible transformation functions, so the density of the data can be computed exactly via the change-of-variables formula.
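A minimal sketch of the idea, assuming a single invertible affine transform $x = a z + b$ over a standard-normal base (illustrative names, not from any specific library): the exact log-density of $x$ is the base log-density of the inverted sample plus the log-determinant of the inverse Jacobian, and the training loss is the mean negative log-likelihood.

```python
import numpy as np

def log_prob_base(z):
    """Standard normal log-density, log N(z; 0, 1)."""
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

def flow_log_prob(x, a, b):
    """Exact log p(x) for the flow x = a*z + b via change of variables:
    log p(x) = log p_z(f^{-1}(x)) + log |d f^{-1}/dx|, where f^{-1}(x) = (x - b)/a."""
    z = (x - b) / a
    return log_prob_base(z) - np.log(np.abs(a))

# Negative log-likelihood of data under the flow: the quantity to minimize.
data = np.array([1.0, 2.0, 3.0])
nll = -np.mean(flow_log_prob(data, a=2.0, b=1.0))
```

With a stack of such transforms, the log-det terms of each layer simply add up, which is what makes deep flows tractable to train by gradient descent on `nll`.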

Normalizing flow Notion
Flow-based Deep Generative Models
So far, I’ve written about two types of generative models, GAN and VAE. Neither of them explicitly learns the probability density function of real data, $p(\mathbf{x})$ (where $\mathbf{x} \in \mathcal{D}$) — because it is really hard! Taking the generative model with latent variables as an example, $p(\mathbf{x}) = \int p(\mathbf{x}\vert\mathbf{z})p(\mathbf{z})d\mathbf{z}$ can hardly be calculated as it is intractable to go through all possible values of the latent code $\mathbf{z}$.
https://lilianweng.github.io/posts/2018-10-13-flow-models/
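This is exactly the intractability a flow sidesteps: instead of integrating over $\mathbf{z}$, an invertible $f$ gives the density in closed form by the change-of-variables formula (same symbols as above):

```latex
\mathbf{x} = f(\mathbf{z}), \qquad
p(\mathbf{x})
= p(\mathbf{z}) \left| \det \frac{\partial f}{\partial \mathbf{z}} \right|^{-1}
= p\!\left(f^{-1}(\mathbf{x})\right) \left| \det \frac{\partial f^{-1}}{\partial \mathbf{x}} \right|
```

Taking logs turns the determinant into an additive correction per layer, which is the term minimized in the negative log-likelihood.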
Normalizing flow explanation (Korean)
Previously covered: GAN at judy-son.tistory.com/7 ([Review] GAN, Generative Adversarial Nets — a framework that trains a generative model adversarially, with a Generator that produces samples and a Discriminator), and VAE at judy-son.tistory.com/11 ([Paper Review] VAE, Auto-Encoding Variational Bayes).
https://judy-son.tistory.com/12

Seonglae Cho