Hidden Markov model

Creator
Seonglae Cho
Created
2024 Oct 21 10:56
Edited
2025 Aug 6 1:20
Refs

HMM

Hidden state
HMMs can explain observed data while inferring hidden internal structure that is not directly visible. An HMM is parameterized by three probability sets:
  • Transition probabilities (hidden state → hidden state)
  • Emission probabilities (hidden state → observation)
  • Initial probabilities (starting distribution over hidden states)
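The three parameter sets above can be sketched concretely. Below is a toy two-state weather model (all state names and numbers are made-up illustrations, not from the source), with the forward algorithm computing the likelihood of an observation sequence:

```python
# Toy HMM: hidden weather states, observable activities.
states = ("Rainy", "Sunny")
observations = ("walk", "shop", "clean")

initial = {"Rainy": 0.6, "Sunny": 0.4}                      # initial probabilities
transition = {
    "Rainy": {"Rainy": 0.7, "Sunny": 0.3},                  # transition probabilities
    "Sunny": {"Rainy": 0.4, "Sunny": 0.6},
}
emission = {
    "Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},      # emission probabilities
    "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
}

def forward(obs_seq):
    """Likelihood of an observation sequence via the forward algorithm."""
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: initial[s] * emission[s][obs_seq[0]] for s in states}
    for obs in obs_seq[1:]:
        alpha = {
            s: sum(alpha[prev] * transition[prev][s] for prev in states) * emission[s][obs]
            for s in states
        }
    return sum(alpha.values())

print(forward(["walk", "shop", "clean"]))
```

The forward recursion marginalizes over all hidden paths in O(T·N²) time instead of enumerating the exponentially many state sequences.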
Hidden Markov model
A hidden Markov model (HMM) is a Markov model in which the observations depend on a latent (hidden) Markov process, referred to as X. An HMM requires that there be an observable process Y whose outcomes depend on the outcomes of X in a known way. Since X cannot be observed directly, the goal is to learn about the state of X by observing Y. By definition of being a Markov model, an HMM has an additional requirement: the outcome of Y at time t = t₀ must be "influenced" exclusively by the outcome of X at t = t₀, and the outcomes of X and Y at t < t₀ must be conditionally independent of Y at t = t₀ given X at t = t₀. Estimation of the parameters in an HMM can be performed using maximum likelihood estimation. For linear chain HMMs, the Baum–Welch algorithm can be used to estimate parameters.
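Since X is only seen through Y, a standard inference task is recovering the most likely hidden state sequence from the observations. A minimal Viterbi-decoding sketch, assuming the same made-up two-state weather parameters as above (Baum–Welch training itself is not shown):

```python
import math

# Hypothetical toy parameters (illustrative values only).
states = ("Rainy", "Sunny")
initial = {"Rainy": 0.6, "Sunny": 0.4}
transition = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
              "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emission = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
            "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

def viterbi(obs_seq):
    """Most likely hidden state sequence given the observations."""
    # best[s] = (log-prob of the best path ending in state s, that path)
    best = {s: (math.log(initial[s] * emission[s][obs_seq[0]]), [s]) for s in states}
    for obs in obs_seq[1:]:
        best = {
            s: max(
                (lp + math.log(transition[prev][s] * emission[s][obs]), path + [s])
                for prev, (lp, path) in best.items()
            )
            for s in states
        }
    return max(best.values())[1]

print(viterbi(["walk", "shop", "clean"]))  # → ['Sunny', 'Rainy', 'Rainy']
```

Log-probabilities are used so long sequences do not underflow; the recursion keeps only the single best predecessor per state, unlike the forward algorithm, which sums over all of them.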

Training Dynamics

Identifying and addressing unstable dynamics early on, potentially by restarting training as …
arxiv.org

Backlinks

Belief State
