The simplest hybrid Bayesian network is the Conditional Linear Gaussian (CLG) model, a hybrid model for which exact inference can be performed by the Junction Tree (JT) algorithm. However, the traditional JT algorithm provides only the exact first two moments for hidden continuous variables. In general, the complexity of exact inference algorithms is exponential in the size of the largest clique of the strongly triangulated graph, which is usually the clique containing all discrete parent nodes of a connected continuous component in the model. Furthermore, for general nonlinear, non-Gaussian hybrid models, it is well known that no exact inference is possible. This paper introduces a new inference approach that unifies message passing between different types of variables. The algorithm provides an exact solution for polytree CLG models and an approximate solution, via loopy propagation, for general hybrid models. To overcome the exponential complexity, we use Gaussian mixture reduction methods to approximate the original density and make the algorithm scalable. The new algorithm provides not only the first two moments but also full density estimates. Empirically, approximation errors due to reduced Gaussian mixtures and loopy propagation are relatively small, especially for nodes that are far from the discrete parent nodes. Numerical experiments show encouraging results.
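To illustrate the Gaussian mixture reduction step mentioned in the abstract, the following is a minimal sketch of greedy pairwise reduction by moment matching for a one-dimensional mixture. The pair-selection cost used here (weighted squared distance between means) is a simplified stand-in for the costs used in established reduction methods, and all function names are illustrative, not from the paper.

```python
import itertools

def merge(c1, c2):
    # Moment-matching merge of two weighted Gaussian components
    # (w, mean, var): preserves the pair's total weight, mean,
    # and variance (i.e., the first two moments of the mixture).
    (w1, m1, v1), (w2, m2, v2) = c1, c2
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + m1 ** 2) + w2 * (v2 + m2 ** 2)) / w - m ** 2
    return (w, m, v)

def reduce_mixture(components, k):
    # Greedily merge the "closest" pair until only k components
    # remain. A weighted squared mean distance is used as the
    # merge cost here for simplicity; practical reduction methods
    # typically use a Kullback-Leibler-based cost instead.
    comps = list(components)
    while len(comps) > k:
        i, j = min(
            itertools.combinations(range(len(comps)), 2),
            key=lambda p: (comps[p[0]][0] * comps[p[1]][0]
                           * (comps[p[0]][1] - comps[p[1]][1]) ** 2),
        )
        merged = merge(comps[i], comps[j])
        comps = [c for t, c in enumerate(comps) if t not in (i, j)]
        comps.append(merged)
    return comps

# Example: reduce a 3-component mixture to 2 components.
mixture = [(0.5, 0.0, 1.0), (0.3, 0.1, 1.0), (0.2, 5.0, 2.0)]
reduced = reduce_mixture(mixture, 2)
```

Because each merge matches moments exactly, the overall mean and variance of the mixture are preserved no matter how many components are collapsed; only higher-order shape information is lost.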
Information Fusion (FUSION), 2010 13th Conference on
Date of Conference: 26-29 July 2010