Abstract
This paper proposes a variational Bayesian approach to SVM regression based on a likelihood model given by an infinite mixture of Gaussians. To evaluate the approach, the method was applied to synthetic datasets, and the resulting approximation was compared with the standard SVM algorithm as well as other well-established methods such as Gaussian processes.
Author to whom all correspondence should be addressed.
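For context on the likelihood mentioned in the abstract: the ε-insensitive noise model underlying SVM regression can be written as a continuous (infinite) mixture of Gaussians over a location and a scale parameter. The following is a minimal sketch of that representation; the exact forms of the mixing densities λ(β) and μ(t) are left unspecified here and are an assumption, not something stated in the abstract:

% Sketch only: epsilon-insensitive likelihood and its scale-location
% mixture-of-Gaussians form, with mixing densities left unspecified.
p(y \mid f(x)) \propto \exp\bigl(-C\,|y - f(x)|_{\varepsilon}\bigr),
\qquad |r|_{\varepsilon} = \max\bigl(0,\ |r| - \varepsilon\bigr),

p(y \mid f(x)) = \int\!\!\int \mathcal{N}\bigl(y \mid f(x) + t,\ \beta^{-1}\bigr)\,
\lambda(\beta)\,\mu(t)\, d\beta\, dt .

In words, the non-Gaussian ε-insensitive likelihood is recovered by averaging Gaussian likelihoods over suitably distributed means and precisions, which is the kind of structure a variational Bayesian treatment can exploit.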
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Gao, J., Gunn, S., Kandola, J. (2002). Adapting Kernels by Variational Approach in SVM. In: McKay, B., Slaney, J. (eds) AI 2002: Advances in Artificial Intelligence. AI 2002. Lecture Notes in Computer Science, vol 2557. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36187-1_35
DOI: https://doi.org/10.1007/3-540-36187-1_35
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-00197-3
Online ISBN: 978-3-540-36187-9