
Author Ghahramani, Zoubin ♦ Beal, Matthew J.
Source CiteSeerX
Content type Text
Publisher MIT Press
File Format PDF
Language English
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Propagation Algorithm ♦ Bayesian Learning ♦ Variational Update ♦ Bayesian Analysis ♦ Conjugate-exponential Graphical Model ♦ Graphical Model ♦ State-space Model ♦ Theoretical Result ♦ Real High-dimensional Data Set ♦ Widespread Tool ♦ Belief Propagation ♦ General Family ♦ Hidden State Dimensionality ♦ Linear-gaussian State-space Model ♦ Variational Bayesian Learning ♦ Variational Approximation ♦ Model Parameter ♦ Synthetic Problem ♦ Learning Procedure ♦ Inference Step
Description In Advances In Neural Information Processing Systems 13
Variational approximations are becoming a widespread tool for Bayesian learning of graphical models. We provide some theoretical results for the variational updates in a very general family of conjugate-exponential graphical models. We show how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning. Applying these results to the Bayesian analysis of linear-Gaussian state-space models we obtain a learning procedure that exploits the Kalman smoothing propagation, while integrating over all model parameters. We demonstrate how this can be used to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.
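The inference step mentioned in the abstract relies on forward-backward (Kalman) smoothing in a linear-Gaussian state-space model. The following is a minimal sketch of a standard Rauch-Tung-Striebel smoother, not the paper's variational variant: the variable names (`A`, `C`, `Q`, `R`, `mu0`, `V0`) are conventional choices assumed here for illustration, and the variational Bayesian algorithm would additionally average these parameters under their posterior distributions.

```python
import numpy as np

def kalman_smoother(y, A, C, Q, R, mu0, V0):
    """Rauch-Tung-Striebel smoother for a linear-Gaussian state-space model.

    y: (T, p) observations; A: (k, k) state dynamics; C: (p, k) emission;
    Q, R: state/observation noise covariances; mu0, V0: initial state prior.
    Returns smoothed means (T, k) and covariances (T, k, k).
    """
    T, k = y.shape[0], A.shape[0]
    mu_f = np.zeros((T, k))      # filtered means
    V_f = np.zeros((T, k, k))    # filtered covariances
    mu_p = np.zeros((T, k))      # one-step predicted means
    V_p = np.zeros((T, k, k))    # one-step predicted covariances

    # Forward pass: Kalman filter.
    for t in range(T):
        if t == 0:
            mu_p[t], V_p[t] = mu0, V0
        else:
            mu_p[t] = A @ mu_f[t - 1]
            V_p[t] = A @ V_f[t - 1] @ A.T + Q
        S = C @ V_p[t] @ C.T + R             # innovation covariance
        K = V_p[t] @ C.T @ np.linalg.inv(S)  # Kalman gain
        mu_f[t] = mu_p[t] + K @ (y[t] - C @ mu_p[t])
        V_f[t] = V_p[t] - K @ C @ V_p[t]

    # Backward pass: RTS smoothing recursions.
    mu_s, V_s = mu_f.copy(), V_f.copy()
    for t in range(T - 2, -1, -1):
        J = V_f[t] @ A.T @ np.linalg.inv(V_p[t + 1])  # smoother gain
        mu_s[t] = mu_f[t] + J @ (mu_s[t + 1] - mu_p[t + 1])
        V_s[t] = V_f[t] + J @ (V_s[t + 1] - V_p[t + 1]) @ J.T
    return mu_s, V_s
```

In the variational Bayesian setting described in the abstract, this smoothing pass would serve as the E-step-like inference over hidden states, alternated with variational updates of the parameter posteriors.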
Educational Role Student ♦ Teacher
Age Range Above 22 years
Educational Use Research
Education Level Undergraduate and Postgraduate ♦ Career/Technical Study
Learning Resource Type Article
Publisher Date 2001-01-01