Monte Carlo Method

Creator
Seonglae Cho
Created
2022 Apr 3 15:45
Edited
2024 Dec 5 15:40

Monte Carlo Approximation, Monte Carlo Estimation: a generalized version of
the Law of Large Numbers

Approximation by repeated random sampling (probabilistic simulation).
An integration technique, typically used to turn an integral over a continuous distribution into a discrete average over samples. For example, consider computing the expectation of f(X) where X has density p.

Elementary Monte Carlo identity

\mathbb{E}_p[f(X)] = \int f(x)\,p(x)\,dx
Assuming we have a way of drawing samples from the density p, we can approximate the integral above (with access to the function f and
Distribution Sampling
)
\hat{f} = \frac{1}{m}\sum_{i=1}^m f(x_i), \quad x_i \sim p
This is an unbiased estimator: \mathbb{E}[\hat{f}] = \mathbb{E}[f(X)]. The approximation improves as m grows.
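A minimal sketch of the estimator above in Python (standard library only). The choice of f(x) = x² under a standard normal is illustrative, not from the note; its true expectation is E[X²] = 1, so the average should land near 1 for large m.

```python
import random

def mc_expectation(f, sample, m):
    """Plain Monte Carlo: average f over m independent draws from the sampler."""
    return sum(f(sample()) for _ in range(m)) / m

rng = random.Random(0)

# f(x) = x^2 under a standard normal: E[X^2] = Var(X) = 1 exactly.
est = mc_expectation(lambda x: x * x, lambda: rng.gauss(0.0, 1.0), m=100_000)
print(est)  # close to 1 for large m
```

The estimator is unbiased for any m; increasing m only shrinks its spread around the true value.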

The stochasticity trick of Monte Carlo method (
Importance sampling
)

Assume you have a difficult integral to compute:
\int f(x)\,dx = \int \frac{f(x)}{g(x)}\, g(x)\, dx = \mathbb{E}_g\left[\frac{f(X)}{g(X)}\right] \approx \frac{1}{m}\sum_{i=1}^m \frac{f(X_i)}{g(X_i)}, \quad X_i \sim g
\mathbb{E}_g[w(X)] \approx \frac{1}{m}\sum_{i=1}^m w(X_i), \quad w(X) = \frac{f(X)}{g(X)}
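A sketch of this identity in Python. The target integrand f(x) = e^{-x²} (whose integral over the real line is √π ≈ 1.7725) and the standard normal proposal g are assumed for illustration:

```python
import random
import math

def importance_sampling(f, g_pdf, g_sample, m):
    """Estimate the integral of f as the average of the weights w = f/g under g."""
    total = 0.0
    for _ in range(m):
        x = g_sample()
        total += f(x) / g_pdf(x)
    return total / m

rng = random.Random(0)

f = lambda x: math.exp(-x * x)                                    # target integrand
g_pdf = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)   # N(0,1) density
g_sample = lambda: rng.gauss(0.0, 1.0)

est = importance_sampling(f, g_pdf, g_sample, m=100_000)
print(est)  # close to sqrt(pi) ~ 1.7725
```

The proposal g must put mass wherever f is nonzero; here the normal tails dominate those of e^{-x²}, so the weights f/g stay bounded and the estimate is well behaved.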
In short, Monte Carlo methods enable us to estimate any integral by random sampling. In
Bayesian Statistics
,
Evidence
is also a form of integral, so it becomes tractable to approximate by sampling.

Monte Carlo variance

\mathbb{V}(\hat{f}) = \mathbb{V}\left(\frac{1}{m}\sum_{i=1}^m f(X_i)\right) = \frac{1}{m^2}\, \mathbb{V}\left(\sum_{i=1}^m f(X_i)\right)
Since the samples are
iid
= \frac{1}{m^2}\, m\, \mathbb{V}(f(X)) = \frac{1}{m}\, \mathbb{V}(f(X))
In other words, as m → ∞, the variance goes to zero.
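The 1/m decay can be checked empirically. The sketch below (assumed setup: f(x) = x² with standard normal draws) measures the estimator's variance across replications at m and 4m, expecting roughly a 4× reduction:

```python
import random
import statistics

rng = random.Random(0)

def estimator(m):
    """One Monte Carlo estimate of E[X^2] from m standard-normal draws."""
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(m)) / m

def estimator_variance(m, reps=2000):
    """Empirical variance of the estimator across independent replications."""
    return statistics.variance(estimator(m) for _ in range(reps))

v100 = estimator_variance(100)
v400 = estimator_variance(400)
print(v100 / v400)  # roughly 4, since variance scales as 1/m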
Monte Carlo Methods
 
 
 
 
 
 

Recommendations