A chain-shaped structure with multiple states
A Markov chain is a model of the stochastic evolution of a system captured in discrete snapshots. The stochastic (transition) matrix describes the probabilities with which the system moves from one state to another. As a result, you can start with a messy process that is not yet stationary, and it will eventually converge to a well-behaved stationary process governed by a single probability law, as long as the process can freely visit all states in the state space (ergodicity) without getting trapped in a loop.
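As a minimal sketch of this convergence (the 3-state transition matrix below is made up purely for illustration), repeatedly applying the stochastic matrix to an arbitrary starting distribution drives it toward a stationary distribution:

```python
import numpy as np

# Hypothetical 3-state stochastic matrix (each row sums to 1); values are illustrative only.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# Start from a "messy" initial distribution and apply the transition matrix repeatedly.
pi = np.array([1.0, 0.0, 0.0])
for step in range(100):
    new_pi = pi @ P          # one discrete-time transition
    if np.allclose(new_pi, pi, atol=1e-10):
        break                # distribution stopped changing: stationary
    pi = new_pi

print("stationary distribution:", pi)   # satisfies pi ≈ pi @ P
```

Because every entry of this particular matrix is positive, the chain can reach any state from any other (it is ergodic), which is what guarantees the loop above settles on a single distribution regardless of the starting point.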
The key point is that the probability of a state at a particular time depends only on the immediately preceding state (the Markov assumption), which makes the chain a discrete-time stochastic process.
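A simple way to see the Markov assumption in code: when simulating the chain, the next state is sampled using only the current state's row of the transition matrix, never the earlier history. This sketch reuses the hypothetical matrix P from above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Same hypothetical 3-state matrix as above.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

state = 0
trajectory = [state]
for _ in range(10):
    # The next state depends only on the current state (Markov assumption):
    state = rng.choice(3, p=P[state])
    trajectory.append(state)

print(trajectory)
```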
At any point in the sequence, the marginal distribution is given by propagating the previous marginal through the transition matrix: P(X_{t+1} = j) = Σ_i P(X_t = i) P_{ij}, or in row-vector form π_{t+1} = π_t P, where P is the stochastic matrix.
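Equivalently, the marginal at time t can be computed in closed form by multiplying the initial distribution by the t-th power of the matrix. A short check, again with the assumed matrix from the earlier sketches:

```python
import numpy as np

P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

pi0 = np.array([1.0, 0.0, 0.0])   # initial marginal distribution
t = 5

# Step-by-step propagation: pi_{k+1} = pi_k P
pi = pi0
for _ in range(t):
    pi = pi @ P

# Closed form: pi_t = pi_0 P^t
pi_closed = pi0 @ np.linalg.matrix_power(P, t)

print(np.allclose(pi, pi_closed))   # True
```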
Markov Chains