Markov Chain

Creator: Seonglae Cho
Created: 2022 Apr 3 15:15
Edited: 2025 Mar 8 16:16

A sequence of chained states with the memoryless property

Markov chains are abstractions of Random Walk (cf. Non-Deterministic Turing Machine). A Markov chain consists of $n$ states, plus an $n \times n$ transition probability matrix $P$.
  • At each step, we are in exactly one of the states.
  • For $1 \le i, j \le n$, the matrix entry $P_{ij}$ tells us the probability of $j$ being the next state, given we are currently in state $i$, where $\sum_{j=1}^{n} P_{ij} = 1$ (a minimal simulation sketch follows below).
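A minimal sketch of such a chain, using a hypothetical three-state weather model whose transition probabilities are made up for illustration:

```python
import numpy as np

# Hypothetical 3-state chain (sunny, cloudy, rainy); the probabilities are illustrative.
P = np.array([
    [0.7, 0.2, 0.1],   # from sunny
    [0.3, 0.4, 0.3],   # from cloudy
    [0.2, 0.4, 0.4],   # from rainy
])
assert np.allclose(P.sum(axis=1), 1.0)  # each row sums to 1 (row-stochastic)

rng = np.random.default_rng(0)

def step(state: int) -> int:
    """Sample the next state j with probability P[state, j]."""
    return rng.choice(len(P), p=P[state])

# Simulate a short trajectory starting from state 0.
trajectory = [0]
for _ in range(10):
    trajectory.append(step(trajectory[-1]))
print(trajectory)
```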
A Markov chain is a model of the stochastic evolution of a system captured in discrete snapshots. The Stochastic matrix describes the probabilities with which the system transitions into different states. Therefore, you can start with a messy process that is not a Stationary process but that eventually converges to a well-behaved Stationary process driven by a single probability law, and the process can freely visit all states (Ergodicity) within the state space without getting trapped in a loop.
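A minimal sketch of this convergence, repeatedly applying an assumed row-stochastic matrix to an arbitrary starting distribution until it settles at a fixed point $\pi = \pi P$:

```python
import numpy as np

# Same illustrative transition matrix as above (assumed, not from the source).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
])

# Start from an arbitrary ("messy") initial distribution and push it through the chain.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(50):
    pi = pi @ P   # one discrete-time step: pi_{t+1} = pi_t P

print(pi)                       # approximate stationary distribution
print(np.allclose(pi, pi @ P))  # True once pi is a fixed point of the transition matrix
```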
The point is that the probability of a state at a particular time depends only on the immediately preceding state (the Markov assumption), making the chain a discrete-time stochastic process.
At any point $x_n$ of the sequence, the marginal distribution is given by $p(x_n) = \sum_{x_{n-1}} p(x_n \mid x_{n-1})\, p(x_{n-1})$.
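A small sketch checking that this sum over the previous state is exactly the row-vector times matrix product $p_n = p_{n-1} P$, with an assumed example distribution:

```python
import numpy as np

P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
])
p_prev = np.array([0.5, 0.3, 0.2])   # assumed p(x_{n-1}) for illustration

# Explicit sum: p(x_n = j) = sum_i p(x_n = j | x_{n-1} = i) * p(x_{n-1} = i)
p_next_sum = np.array([sum(P[i, j] * p_prev[i] for i in range(3)) for j in range(3)])

# Same marginal as a row-vector times matrix product
p_next_mat = p_prev @ P

assert np.allclose(p_next_sum, p_next_mat)
print(p_next_mat)
```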
Markov Chain Notion
