Lumpable hidden Markov models - model reduction and reduced complexity filtering

Langford B. White, Robert Mahony, Gary D. Brushe

Research output: Contribution to journal › Article › peer-review

39 Citations (Scopus)


This paper is concerned with filtering of hidden Markov processes (HMPs) which possess (or approximately possess) the property of lumpability. This property generalizes the lumpability of a Markov chain, which has been addressed previously by others. In essence, lumpability means that there is a partition of the (atomic) states of the Markov chain into aggregated sets which behave equivalently with respect to both the state dynamics and the observation statistics. We prove necessary and sufficient conditions on the HMP for exact lumpability to hold. For a particular class of hidden Markov models (HMMs), namely finite output alphabet models, conditions are given for lumpability of all HMPs representable by a specified HMM. The corresponding optimal filter algorithms for the aggregated states are then derived. The paper also describes an approach to efficient suboptimal filtering for HMPs which are approximately lumpable, meaning that the HMM generating the process may be approximated by a lumpable HMM. This approach involves directly finding a lumped HMM which approximates the original HMM well, in a matrix norm sense. An alternative approach to model reduction, based on approximating a given HMM by an exactly lumpable HMM via the alternating convex projections algorithm, is also derived. Simulation examples are presented which illustrate the performance of the suboptimal filtering algorithms.
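To make the partition idea concrete, here is a minimal sketch (not taken from the paper, and covering only the ordinary lumpability of the underlying Markov chain, not the full HMP conditions the paper proves): a chain is lumpable with respect to a partition if, for every block of the partition, all states within a block have the same total transition probability into each block. The function names and the example matrix below are illustrative choices, not from the source.

```python
def is_lumpable(P, partition, tol=1e-12):
    """Check ordinary lumpability of a transition matrix P (list of rows)
    with respect to `partition`, a list of lists of state indices.

    P is lumpable iff for every pair of blocks (B, C), every state in B
    has the same total probability of transitioning into C."""
    for block in partition:
        for target in partition:
            # Total transition probability from each state in `block`
            # into `target`; these must agree across the block.
            sums = [sum(P[i][k] for k in target) for i in block]
            if max(sums) - min(sums) > tol:
                return False
    return True


def lump(P, partition):
    """Build the aggregated (lumped) transition matrix, assuming
    `is_lumpable(P, partition)` holds, so any representative state
    of each block gives the same block-to-block probabilities."""
    return [[sum(P[block[0]][k] for k in target) for target in partition]
            for block in partition]


# Illustrative 3-state chain: states 0 and 1 behave identically toward
# the blocks {0, 1} and {2}, so they may be aggregated.
P = [[0.2, 0.3, 0.5],
     [0.4, 0.1, 0.5],
     [0.3, 0.3, 0.4]]
partition = [[0, 1], [2]]
```

With this `P`, `is_lumpable(P, partition)` holds and `lump(P, partition)` yields the 2-state aggregated chain `[[0.5, 0.5], [0.6, 0.4]]`. The paper's contribution goes further: it characterizes when the observation statistics also respect the partition, so that the optimal filter can be run on the aggregated states.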

Original language: English
Pages (from-to): 2297-2306
Number of pages: 10
Journal: IEEE Transactions on Automatic Control
Issue number: 12
Publication status: Published - Dec 2000
Externally published: Yes


