Abstract
Latent structure models involve real, potentially observable variables and latent, unobservable variables. The framework includes many familiar types of model, such as factor analysis, latent class analysis, latent trait analysis, latent profile models, mixtures of factor analysers, state-space models and others. The simplest scenario, involving a single discrete latent variable, covers finite mixture models, hidden Markov chain models and hidden Markov random field models. The paper gives a brief tutorial on the application of maximum likelihood and Bayesian approaches to parameter estimation within these models, emphasising in particular that computational complexity varies greatly across the different scenarios. For the case of a single discrete latent variable, the issue of assessing its cardinality is discussed. Techniques such as the EM algorithm, Markov chain Monte Carlo methods and variational approximations are mentioned.
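The simplest case named above, a finite mixture model, makes the two computational themes of the abstract concrete: maximum likelihood estimation of the parameters via the EM algorithm, and assessment of the cardinality of the discrete latent variable. The sketch below is a minimal illustration, not code from the paper: it fits a one-dimensional Gaussian mixture by EM and then compares candidate numbers of components with BIC (Schwarz's criterion). NumPy is assumed, and the function names (em_gmm, bic) and the synthetic data are invented for the example.

```python
import numpy as np

def em_gmm(x, K, n_iter=200, seed=0):
    """Fit a one-dimensional K-component Gaussian mixture by EM.

    The latent variable z_i in {1, ..., K} is the unobserved component
    label; the E-step computes its posterior ("responsibilities") and
    the M-step maximises the expected complete-data log-likelihood.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    w = np.full(K, 1.0 / K)                 # mixing weights
    mu = rng.choice(x, K, replace=False)    # crude initial means
    var = np.full(K, x.var())               # pooled initial variance

    def log_joint(w, mu, var):
        # log w_k + log N(x_i; mu_k, var_k), shape (n, K)
        return (np.log(w) - 0.5 * np.log(2 * np.pi * var)
                - 0.5 * (x[:, None] - mu) ** 2 / var)

    for _ in range(n_iter):
        # E-step: r[i, k] = P(z_i = k | x_i, current parameters)
        lj = log_joint(w, mu, var)
        r = np.exp(lj - np.logaddexp.reduce(lj, axis=1, keepdims=True))
        # M-step: weighted maximum-likelihood updates
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        var = np.maximum(var, 1e-6)         # guard against degeneracy

    loglik = np.logaddexp.reduce(log_joint(w, mu, var), axis=1).sum()
    return w, mu, var, loglik

def bic(loglik, n_params, n):
    # Schwarz's criterion; smaller values indicate a better trade-off.
    return -2.0 * loglik + n_params * np.log(n)

# Synthetic data from a two-component mixture, so K = 2 should win.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.7, 200)])

for K in (1, 2, 3):
    *_, ll = em_gmm(x, K)
    n_params = 3 * K - 1        # (K-1) weights + K means + K variances
    print(f"K={K}: BIC = {bic(ll, n_params, len(x)):.1f}")
```

In a Bayesian treatment of the same model, the M-step updates would be replaced by posterior sampling (MCMC) or by deterministic variational updates; the computation of responsibilities over the latent labels is the part all three approaches share.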
Cite this paper
Titterington, D.M. (2006). Some Aspects of Latent Structure Analysis. In: Saunders, C., Grobelnik, M., Gunn, S., Shawe-Taylor, J. (eds) Subspace, Latent Structure and Feature Selection. SLSFS 2005. Lecture Notes in Computer Science, vol 3940. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11752790_4