
Hierarchical Gaussian process mixtures for regression

Published in Statistics and Computing

Abstract

Owing to their good empirical performance and desirable analytical properties, Gaussian process regression models are attracting increasing interest in statistics, engineering and other fields. However, two major problems arise when the model is applied to a large data-set with repeated measurements. One stems from systematic heterogeneity among the different replications; the other is the need to invert a covariance matrix whose dimension equals the sample size of the training data-set. In this paper, a Gaussian process mixture model for regression is proposed to deal with both problems, and a hybrid Markov chain Monte Carlo (MCMC) algorithm is used for its implementation. An application to a real data-set is reported.



Author information

Correspondence to J.Q. Shi.


About this article

Cite this article

Shi, J., Murray-Smith, R. & Titterington, D. Hierarchical Gaussian process mixtures for regression. Stat Comput 15, 31–41 (2005). https://doi.org/10.1007/s11222-005-4787-7
