Abstract
Regularization networks are an important supervised learning method applicable to both regression and classification tasks. They rest on a solid theoretical foundation, but the presence of meta-parameters is their drawback. These meta-parameters, including the type of kernel function, are typically assumed to be given in advance as an input of the algorithm. In this paper, we propose multi-kernel functions, namely product kernel functions and composite kernel functions. The choice of kernel function thus becomes part of the optimization process, for which we introduce a new evolutionary learning algorithm that handles different kernel functions, including composite kernels. We demonstrate the results in experiments on benchmark tasks.
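The two multi-kernel constructions named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the Gaussian base kernel, and the convex-combination form of the composite kernel are assumptions for the example; the width and mixing parameters stand in for the meta-parameters the evolutionary algorithm would search over.

```python
import numpy as np

def gaussian_kernel(x, y, width):
    """Gaussian base kernel; `width` is a meta-parameter."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(diff, diff) / width ** 2)

def product_kernel(x, y, split, w1, w2):
    """Product kernel: the input attributes are split into two groups
    and a (possibly different) kernel is applied to each group; the
    results are multiplied."""
    return (gaussian_kernel(x[:split], y[:split], w1)
            * gaussian_kernel(x[split:], y[split:], w2))

def sum_kernel(x, y, alpha, w1, w2):
    """Composite (sum) kernel: a convex combination of two kernels
    evaluated on the full input vector; `alpha` in [0, 1]."""
    return (alpha * gaussian_kernel(x, y, w1)
            + (1.0 - alpha) * gaussian_kernel(x, y, w2))
```

Both constructions preserve positive semi-definiteness of the base kernels, which is what makes them admissible kernel units in a regularization network; an evolutionary search can then treat the choice between a plain, product, or sum kernel (and the associated widths) as part of the genotype.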
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Vidnerová, P., Neruda, R. (2011). Evolutionary Learning of Regularization Networks with Multi-kernel Units. In: Liu, D., Zhang, H., Polycarpou, M., Alippi, C., He, H. (eds) Advances in Neural Networks – ISNN 2011. ISNN 2011. Lecture Notes in Computer Science, vol 6675. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21105-8_62
DOI: https://doi.org/10.1007/978-3-642-21105-8_62
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21104-1
Online ISBN: 978-3-642-21105-8
eBook Packages: Computer Science (R0)