TY - JOUR

T1 - Faster algorithms for max-product message-passing

AU - McAuley, Julian J.

AU - Caetano, Tibério S.

PY - 2011/4

Y1 - 2011/4

N2 - Maximum A Posteriori inference in graphical models is often solved via message-passing algorithms, such as the junction-tree algorithm or loopy belief-propagation. The exact solution to this problem is well-known to be exponential in the size of the maximal cliques of the triangulated model, while approximate inference is typically exponential in the size of the model's factors. In this paper, we take advantage of the fact that many models have maximal cliques that are larger than their constituent factors, and also of the fact that many factors consist only of latent variables (i.e., they do not depend on an observation). This is a common case in a wide variety of applications that deal with grid-, tree-, and ring-structured models. In such cases, we are able to decrease the exponent of complexity for message-passing by 0.5 for both exact and approximate inference. We demonstrate that message-passing operations in such models are equivalent to some variant of matrix multiplication in the tropical semiring, for which we offer an O(N^2.5) expected-case solution.

AB - Maximum A Posteriori inference in graphical models is often solved via message-passing algorithms, such as the junction-tree algorithm or loopy belief-propagation. The exact solution to this problem is well-known to be exponential in the size of the maximal cliques of the triangulated model, while approximate inference is typically exponential in the size of the model's factors. In this paper, we take advantage of the fact that many models have maximal cliques that are larger than their constituent factors, and also of the fact that many factors consist only of latent variables (i.e., they do not depend on an observation). This is a common case in a wide variety of applications that deal with grid-, tree-, and ring-structured models. In such cases, we are able to decrease the exponent of complexity for message-passing by 0.5 for both exact and approximate inference. We demonstrate that message-passing operations in such models are equivalent to some variant of matrix multiplication in the tropical semiring, for which we offer an O(N^2.5) expected-case solution.

KW - Belief-propagation

KW - Graphical models

KW - Tropical matrix multiplication

UR - http://www.scopus.com/inward/record.url?scp=79955823916&partnerID=8YFLogxK

M3 - Article

SN - 1532-4435

VL - 12

SP - 1349

EP - 1388

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

ER -