Markov Chain

Created: 2022 Apr 3 15:15
Creator: Seonglae Cho
Edited: 2025 Mar 21 16:55

Markovian

A sequence of several chained states with the memoryless property

Markov chains are abstractions of Random Walk (Non-Deterministic Turing Machine). A Markov chain consists of $N$ states, plus an $N \times N$ transition probability matrix $P$.
  • At each step, we are in exactly one of the $N$ states.
  • For $1 \le i, j \le N$, the matrix entry $P_{ij}$ tells us the probability of $j$ being the next state, given we are currently in state $i$, where $\sum_{j=1}^{N} P_{ij} = 1$ (see the sketch below).
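A minimal sketch in Python of what $P$ encodes, using a made-up 3-state chain (the numbers and state labels are purely illustrative):

```python
import numpy as np

# Hypothetical 3-state chain; P[i, j] is the probability that the next state
# is j given that the current state is i, so every row sums to 1.
P = np.array([
    [0.90, 0.075, 0.025],
    [0.15, 0.80,  0.05 ],
    [0.25, 0.25,  0.50 ],
])
assert np.allclose(P.sum(axis=1), 1.0)

rng = np.random.default_rng(0)

def step(state: int) -> int:
    """Sample the next state using only the current state (memoryless)."""
    return int(rng.choice(len(P), p=P[state]))

# A short trajectory starting from state 0.
trajectory = [0]
for _ in range(10):
    trajectory.append(step(trajectory[-1]))
print(trajectory)
```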
A Markov chain is a model of the stochastic evolution of a system captured in discrete snapshots. The Stochastic matrix describes the probabilities with which the system transitions into the different states. You can therefore start with a messy process that is not a Stationary process but that eventually converges to a well-behaved Stationary process driven by a single probability law, and the process can freely visit all states (Ergodicity) within the state space without getting trapped in a loop, as the convergence sketch below illustrates.
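A rough sketch of that convergence, reusing the illustrative matrix above: starting from a distribution concentrated on a single state, repeatedly applying $P$ drives the marginals toward a stationary distribution $\pi$ with $\pi = \pi P$ (assuming the chain is ergodic):

```python
import numpy as np

P = np.array([
    [0.90, 0.075, 0.025],
    [0.15, 0.80,  0.05 ],
    [0.25, 0.25,  0.50 ],
])

# A "messy" start: all probability mass on one state.
pi = np.array([1.0, 0.0, 0.0])

# Push the distribution through the transition matrix many times.
for _ in range(200):
    pi = pi @ P
print("long-run distribution:", pi)

# The stationary distribution is the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
stationary = stationary / stationary.sum()
print("stationary eigenvector:", stationary)
assert np.allclose(pi, stationary, atol=1e-8)
```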
The point is that the probability of a state at a particular time depends only on the immediately preceding state (the Markov assumption); a Markov chain is thus a discrete-time stochastic process.
At any point of the sequence, the marginal distribution is given by
$$P(X_t = j) = \sum_{i} P(X_{t-1} = i)\, P_{ij}, \qquad \text{equivalently} \quad \pi^{(t)} = \pi^{(0)} P^{t}$$
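A quick numerical check of this formula, again with the illustrative matrix and a made-up initial distribution: propagating $\pi^{(0)}$ through $P$ for $t$ steps should match the empirical state frequencies of many simulated chains:

```python
import numpy as np

P = np.array([
    [0.90, 0.075, 0.025],
    [0.15, 0.80,  0.05 ],
    [0.25, 0.25,  0.50 ],
])
pi0 = np.array([0.2, 0.5, 0.3])   # made-up initial distribution
t = 5

# Marginal at time t via the recursion pi_t = pi_{t-1} P (i.e. pi_0 @ P^t).
pi_t = pi0.copy()
for _ in range(t):
    pi_t = pi_t @ P

# Monte Carlo check: simulate many independent chains for t steps.
rng = np.random.default_rng(0)
n = 20_000
states = rng.choice(3, size=n, p=pi0)
for _ in range(t):
    states = np.array([rng.choice(3, p=P[s]) for s in states])
empirical = np.bincount(states, minlength=3) / n

print("pi_t by recursion: ", pi_t)
print("pi_t by simulation:", empirical)
```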
Markov Chain Notion
$$P(X_{t+1} = x_{t+1} \mid X_1 = x_1, \ldots, X_t = x_t) = P(X_{t+1} = x_{t+1} \mid X_t = x_t)$$
$$P_{ij} = P(X_{t+1} = j \mid X_t = i), \qquad \sum_{j=1}^{N} P_{ij} = 1$$
$$\pi = \pi P \quad \text{(stationary distribution)}$$
