Abstract
This tutorial describes the mean-field variational Bayesian approximation to inference in graphical models, using modern machine learning terminology rather than statistical physics concepts. It begins by seeking a factorized (mean-field) approximating distribution that is close to the target joint posterior in the Kullback-Leibler (KL) divergence sense. It then derives local node update equations and reviews the recent Variational Message Passing framework.
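To make the abstract's description concrete, the following is a minimal sketch (not the paper's code) of coordinate-ascent mean-field variational Bayes for the standard conjugate example of a Gaussian with unknown mean and precision. The factorized approximation q(mu, tau) = q(mu) q(tau) is refined by alternating the two local update equations until the moments stabilize; all hyperparameter values and the synthetic data are illustrative assumptions.

```python
import random

# Mean-field VB for x_i ~ N(mu, 1/tau) with conjugate priors
#   mu  ~ N(mu0, 1/(lam0 * tau)),   tau ~ Gamma(a0, b0)
# and the factorized approximation q(mu, tau) = q(mu) q(tau).

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(200)]  # synthetic observations

mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0   # illustrative prior settings
n = len(data)
xbar = sum(data) / n

e_tau = a0 / b0                           # initial guess for E[tau]
for _ in range(50):                       # alternate the two local updates
    # q(mu) = N(mu_n, 1/lam_n): uses the current expectation E[tau]
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = (lam0 + n) * e_tau
    # q(tau) = Gamma(a_n, b_n): uses the current moments of q(mu)
    e_mu = mu_n
    e_mu2 = mu_n ** 2 + 1.0 / lam_n       # E[mu^2] under q(mu)
    a_n = a0 + (n + 1) / 2.0
    b_n = b0 + 0.5 * (sum(x * x - 2 * x * e_mu + e_mu2 for x in data)
                      + lam0 * (e_mu2 - 2 * mu0 * e_mu + mu0 ** 2))
    e_tau = a_n / b_n                     # refresh E[tau] for the next sweep

# Posterior mean should sit near 2 and the expected precision near 1.
print(mu_n, e_tau)
```

Each update touches only the local quantities that node needs (sufficient statistics of the data and expectations under the other factor), which is exactly the locality that the Variational Message Passing framework exploits to automate such derivations.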
References
Attias H (2000) A variational Bayesian framework for graphical models. In: Advances in neural information processing systems. MIT Press
Bernardo JM, Smith AFM (2000) Bayesian theory. Wiley, London
Bishop CM, Winn JM, Spiegelhalter D (2002) VIBES: a variational inference engine for Bayesian networks. In: Advances in neural information processing systems
Winn JM, Bishop CM (2005) Variational message passing. J Mach Learn Res 6:661–694
Cite this article
Fox, C.W., Roberts, S.J. A tutorial on variational Bayesian inference. Artif Intell Rev 38, 85–95 (2012). https://doi.org/10.1007/s10462-011-9236-8