Hidden Markov Models

Language Processing pp 107–136

  • Beat Pfister
  • Tobias Kaufmann
Chapter

Summary

This chapter covers the basics of Hidden Markov Models (HMMs), in particular the algorithms that solve the fundamental HMM problems: the forward algorithm, the Viterbi algorithm, and the Baum-Welch algorithm. The trellis diagram is used to explain these algorithms clearly. Since the HMMs used in speech processing can have discrete or continuous observation probability distributions, the associated algorithms are given for both cases.
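To make the trellis computation concrete, here is a minimal sketch of the forward algorithm for a discrete-observation HMM (an illustrative implementation, not the book's own code; the parameter names `pi`, `A`, `B` follow the usual HMM notation and are assumptions):

```python
def forward(pi, A, B, obs):
    """Forward algorithm: compute P(obs | model) for a discrete HMM.

    pi[i]  : initial probability of state i
    A[i][j]: transition probability from state i to state j
    B[i][k]: probability of emitting symbol k in state i
    obs    : observation sequence as a list of symbol indices
    """
    n = len(pi)
    # Initialization: first trellis column, alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: each further observation adds one column to the trellis;
    # each cell sums over all predecessor states, then applies the emission
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: sum over the last trellis column
    return sum(alpha)
```

The Viterbi algorithm has the same trellis structure; it replaces the sum over predecessor states with a maximum and additionally records the maximizing predecessor for backtracking.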

Finally, a solution to the underflow problem is presented. This problem arises with longer observation sequences, because a large number of probabilities must be multiplied with one another, so the result tends toward zero.
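One common way to avoid this underflow (sketched here for illustration; the chapter's own remedy may instead use per-column scaling factors) is to carry out the forward recursion in the log domain, replacing sums of probabilities with the numerically stable log-sum-exp:

```python
import math

def log_forward(log_pi, log_A, log_B, obs):
    """Forward algorithm in the log domain: returns log P(obs | model).

    All parameters are the element-wise logarithms of pi, A, and B.
    Products of probabilities become sums of logs, so even very long
    observation sequences never underflow to zero.
    """
    def logsumexp(xs):
        # Stable log(sum(exp(x))): factor out the maximum first
        m = max(xs)
        return m + math.log(sum(math.exp(x - m) for x in xs))

    n = len(log_pi)
    # Initialization: log alpha_1(i) = log pi_i + log b_i(o_1)
    la = [log_pi[i] + log_B[i][obs[0]] for i in range(n)]
    # Induction: sum over predecessors done via log-sum-exp
    for o in obs[1:]:
        la = [logsumexp([la[i] + log_A[i][j] for i in range(n)]) + log_B[j][o]
              for j in range(n)]
    # Termination: log of the sum over the last trellis column
    return logsumexp(la)
```

For a sequence of a thousand observations the plain forward probability would be far below the smallest representable double, while its logarithm remains a perfectly ordinary finite number.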


Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. ETH Zurich, Zurich, Switzerland