Improving the Mean-Field Approximation in Belief Networks Using Bahadur's Reparameterisation of the Multivariate Binary Distribution

Published in: Neural Processing Letters

Abstract

We develop a new extension of the Mean-Field approximation for inference in graphical models that has advantages over other approximation schemes that have been proposed. The method is economical in its use of variational parameters, and the approximating conditional distribution can be specified with direct reference to the dependence structure of the variables in the graphical model. We apply the method to sigmoid belief networks.
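For context, Bahadur's reparameterisation expresses an arbitrary multivariate binary distribution through its marginals and the correlations of standardized variables. In its standard form (stated here for orientation, not quoted from the article), with binary variables $s_1,\dots,s_n$, marginals $p_i = P(s_i = 1)$ and standardized variables $z_i = (s_i - p_i)/\sqrt{p_i(1-p_i)}$,

p(s_1,\dots,s_n) = \prod_{i=1}^{n} p_i^{s_i}(1-p_i)^{1-s_i}\,\Big[\,1 + \sum_{i<j} r_{ij} z_i z_j + \sum_{i<j<k} r_{ijk} z_i z_j z_k + \cdots + r_{12\cdots n}\, z_1 z_2 \cdots z_n\,\Big],

where $r_{ij} = E[z_i z_j]$, $r_{ijk} = E[z_i z_j z_k]$, and so on. The fully factorized Mean-Field distribution corresponds to keeping only the leading product, so truncations of the bracketed expansion give a family of richer approximating distributions whose extra parameters follow the dependence structure of the variables.

The short Python sketch below is not taken from the article; it simply verifies numerically that the representation above is exact when all orders are retained, for a small randomly generated joint distribution (the variable names and the choice of n = 3 are illustrative).

import itertools
import numpy as np

# Illustrative check (not the article's method): reconstruct a small
# multivariate binary distribution from its Bahadur parameters.
rng = np.random.default_rng(0)
n = 3
joint = rng.random(2 ** n)          # a strictly positive joint over {0,1}^n
joint /= joint.sum()
states = np.array(list(itertools.product([0, 1], repeat=n)))  # shape (2**n, n)

# First-order parameters p_i = E[s_i] and standardized variables z_i(s).
p = joint @ states
z = (states - p) / np.sqrt(p * (1 - p))

# Higher-order Bahadur parameters r_A = E[prod_{i in A} z_i] for |A| >= 2
# (the |A| = 1 terms vanish because the p_i are the true marginals).
r = {A: joint @ np.prod(z[:, list(A)], axis=1)
     for k in range(2, n + 1)
     for A in itertools.combinations(range(n), k)}

# Bahadur's representation: product of Bernoulli marginals times a
# correction factor built from the r_A terms.
bernoulli = np.prod(p ** states * (1 - p) ** (1 - states), axis=1)
correction = 1 + sum(rA * np.prod(z[:, list(A)], axis=1) for A, rA in r.items())
reconstructed = bernoulli * correction

print(np.abs(reconstructed - joint).max())   # ~1e-16: exact up to rounding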




Cite this article

Humphreys, K., Titterington, D.M. Improving the Mean-Field Approximation in Belief Networks Using Bahadur's Reparameterisation of the Multivariate Binary Distribution. Neural Processing Letters 12, 183–197 (2000). https://doi.org/10.1023/A:1009617914949
