Hidden Markov model
A hidden Markov model (HMM) is a Markov model in which the observations are dependent on a latent (or hidden) Markov process, referred to as X. An HMM requires that there be an observable process Y whose outcomes depend on the outcomes of X in a known way. Since X cannot be observed directly, the goal is to learn about the state of X by observing Y. Because it is a Markov model, an HMM has the additional requirement that the outcome of Y at time t = t₀ must be "influenced" exclusively by the outcome of X at t = t₀, and that the outcomes of X and Y at times t < t₀ must be conditionally independent of Y at t = t₀ given X at t = t₀.
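The dependence structure above can be made concrete with a small simulation. The following is a minimal sketch, not taken from the article: the two-state transition matrix A, emission matrix B, and initial distribution pi are hypothetical values chosen only for illustration. Each observation Y[t] is drawn using only the current hidden state X[t], reflecting the requirement described above.

import numpy as np

rng = np.random.default_rng(0)

n_states, n_symbols = 2, 3
pi = np.array([0.6, 0.4])                  # P(X_0 = i)
A = np.array([[0.7, 0.3],                  # A[i, j] = P(X_t = j | X_{t-1} = i)
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],             # B[i, k] = P(Y_t = k | X_t = i)
              [0.1, 0.3, 0.6]])

def sample_hmm(T):
    """Draw a length-T hidden state path X and observation sequence Y."""
    X = np.empty(T, dtype=int)
    Y = np.empty(T, dtype=int)
    X[0] = rng.choice(n_states, p=pi)
    Y[0] = rng.choice(n_symbols, p=B[X[0]])
    for t in range(1, T):
        X[t] = rng.choice(n_states, p=A[X[t - 1]])   # Markov step for the hidden chain
        Y[t] = rng.choice(n_symbols, p=B[X[t]])      # emission depends only on X[t]
    return X, Y

X, Y = sample_hmm(10)
print("hidden states:", X)
print("observations: ", Y)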
Estimation of the parameters in an HMM can be performed using maximum likelihood estimation. For linear-chain HMMs, the Baum–Welch algorithm can be used to estimate the parameters.
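As an illustration of the estimation step, below is a sketch of a single Baum–Welch (expectation–maximization) iteration for a discrete-emission, linear-chain HMM, assuming arrays A, B, pi, and an observation sequence Y as in the previous sketch. It is written for clarity rather than numerical robustness (no log-space scaling) and is not presented as the article's reference implementation.

import numpy as np

def baum_welch_step(Y, A, B, pi):
    """One EM iteration: forward-backward E-step, then parameter re-estimation."""
    T, N = len(Y), A.shape[0]

    # Forward pass: alpha[t, i] = P(Y_0..Y_t, X_t = i)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, Y[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, Y[t]]

    # Backward pass: beta[t, i] = P(Y_{t+1}..Y_{T-1} | X_t = i)
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, Y[t + 1]] * beta[t + 1])

    likelihood = alpha[-1].sum()   # P(Y) under the current parameters

    # E-step: posteriors over states and transitions given Y
    gamma = alpha * beta / likelihood                          # gamma[t, i] = P(X_t = i | Y)
    xi = (alpha[:-1, :, None] * A[None] *
          (B[:, Y[1:]].T * beta[1:])[:, None, :]) / likelihood # xi[t, i, j] = P(X_t=i, X_{t+1}=j | Y)

    # M-step: re-estimate pi, A, B to increase the likelihood
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[Y == k].sum(axis=0)
    new_B /= gamma.sum(axis=0)[:, None]
    return new_A, new_B, new_pi, likelihood

# Usage (with the arrays from the previous sketch):
# A, B, pi, ll = baum_welch_step(Y, A, B, pi)

Repeating this step until the likelihood stops improving yields the usual Baum–Welch training loop, which converges to a local maximum of the likelihood.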
https://en.wikipedia.org/wiki/Hidden_Markov_model