Faster algorithms for max-product message-passing

Julian J. McAuley, Tibério S. Caetano

    Research output: Contribution to journal › Article › peer-review

    15 Citations (Scopus)


    Maximum A Posteriori inference in graphical models is often solved via message-passing algorithms, such as the junction-tree algorithm or loopy belief-propagation. The exact solution to this problem is well-known to be exponential in the size of the maximal cliques of the triangulated model, while approximate inference is typically exponential in the size of the model's factors. In this paper, we take advantage of the fact that many models have maximal cliques that are larger than their constituent factors, and also of the fact that many factors consist only of latent variables (i.e., they do not depend on an observation). This is a common case in a wide variety of applications that deal with grid-, tree-, and ring-structured models. In such cases, we are able to decrease the exponent of complexity for message-passing by 0.5 for both exact and approximate inference. We demonstrate that message-passing operations in such models are equivalent to some variant of matrix multiplication in the tropical semiring, for which we offer an O(N^2.5) expected-case solution.
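    To make the central operation concrete, here is a minimal sketch (not the paper's O(N^2.5) algorithm) of naive max-product matrix multiplication in the tropical semiring, where the usual sum over k is replaced by a max. A max-product message update of the form m(x_j) = max over x_i of Phi(x_i) * Psi(x_i, x_j) is exactly one row of this product; the function name and example values are illustrative only.

    ```python
    def max_product_matmul(A, B):
        """Tropical (max-product) matrix multiplication:
        C[i][j] = max_k A[i][k] * B[k][j].
        Naive O(N^3) loop; the paper speeds up this operation
        to O(N^2.5) expected time."""
        n_rows = len(A)
        n_inner = len(B)
        n_cols = len(B[0])
        return [
            [max(A[i][k] * B[k][j] for k in range(n_inner))
             for j in range(n_cols)]
            for i in range(n_rows)
        ]

    # Hypothetical potentials: Phi as a 2x2 unary table, Psi pairwise.
    Phi = [[0.5, 0.2],
           [0.1, 0.9]]
    Psi = [[0.3, 0.6],
           [0.8, 0.4]]
    messages = max_product_matmul(Phi, Psi)
    print(messages)  # each entry is max_k Phi[i][k] * Psi[k][j]
    ```

    Replacing max with sum recovers ordinary matrix multiplication, which is the sum-product (marginalization) analogue of the same message-passing step.
    
    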

    Original language: English
    Pages (from-to): 1349-1388
    Number of pages: 40
    Journal: Journal of Machine Learning Research
    Publication status: Published - Apr 2011


