Abstract

We develop the recursion for hidden Markov (HM) models proposed by Bartolucci and Besag (2002), and we show how it may be used to implement an estimation algorithm for these models whose memory requirement does not depend on the length of the observed series of data. This recursion yields the conditional distribution of the latent state at every occasion, given the previous state and the observed data. Compared with the estimation algorithm based on the well-known Baum-Welch recursions, whose memory requirement grows with the sample size, the proposed algorithm has the further advantage of not requiring dummy renormalizations to avoid numerical underflow. Moreover, it directly allows us to perform global decoding of the latent sequence of states, without the need for the Viterbi algorithm and with a considerable reduction in memory requirement relative to the latter. The proposed approach is compared, in terms of computing time and memory requirement, with the algorithm based on the Baum-Welch recursions and with the so-called linear memory algorithm of Churbanov and Winters-Hilt. The comparison also relies on a series of simulations involving an HM model for continuous time-series data.
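To make the central quantity of the abstract concrete, the following minimal NumPy sketch computes, for a discrete HM model, the conditional distributions p(u_t | u_{t-1}, y_1, ..., y_T) through a normalized backward recursion, so that every stored quantity is a proper probability and no ad hoc renormalization against underflow is needed. This is an illustration only, not the authors' constant-memory algorithm: storing all T conditional matrices, as done here, incurs exactly the O(T) memory cost the paper avoids. All names (`conditional_state_distributions`, `Pi`, `trans`, `emis_probs`) are hypothetical.

```python
import numpy as np

def conditional_state_distributions(Pi, trans, emis_probs):
    """Pi: (k,) initial distribution; trans: (k, k) transition matrix;
    emis_probs: (T, k), with emis_probs[t, j] = f(y_{t+1} | state j).

    Returns p1, the posterior of the first state given all the data, and
    cond, with cond[t, i, j] = p(state j at occasion t+2 | state i at
    occasion t+1, y_1, ..., y_T), for t = 0, ..., T-2."""
    T, k = emis_probs.shape
    beta = np.ones(k)                        # rescaled backward variables
    cond = np.empty((T - 1, k, k))
    for t in range(T - 1, 0, -1):
        # weight of each (previous state, current state) pair given y_{t:T}
        M = trans * (emis_probs[t] * beta)
        cond[t - 1] = M / M.sum(axis=1, keepdims=True)
        beta = M.sum(axis=1)
        beta /= beta.sum()                   # rescaling leaves the ratios intact
    p1 = Pi * emis_probs[0] * beta
    return p1 / p1.sum(), cond

# Usage on a small synthetic model:
rng = np.random.default_rng(0)
k, T = 3, 100
Pi = np.full(k, 1.0 / k)
trans = rng.dirichlet(np.ones(k), size=k)    # each row a distribution
emis_probs = rng.random((T, k))              # stand-in for f(y_t | state j)
p1, cond = conditional_state_distributions(Pi, trans, emis_probs)
```

In a constant-memory scheme of the kind the abstract describes, these conditional distributions would presumably be consumed on the fly, e.g., to accumulate the sufficient statistics of an EM step, rather than stored for every occasion as above.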

  • Publication date: 2014-02-01
  • Affiliation: Perugia