Abstract
Kernel parameter optimization is one of the most challenging problems in kernel Fisher discriminant analysis (KFDA). In this paper, a simple and effective criterion for optimizing KFDA kernel parameters is proposed on the basis of the maximum margin criterion (MMC), which maximizes the distances between any two classes. This MMC-based criterion is applied to kernel parameter optimization for both KFDA and KFDA with a Locally Linear Embedding affinity matrix (KFDA-LLE). Experiments on six real-world multiclass datasets demonstrate that, compared with two other criteria, the MMC-based criterion detects the optimal KFDA kernel parameters more accurately for both the RBF kernel and the polynomial kernel.
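Such an MMC-style score can be evaluated directly from the kernel matrix via the kernel trick. The following is a minimal sketch (not the authors' implementation) of computing tr(S_b) - tr(S_w) in the kernel-induced feature space for an RBF kernel and selecting the kernel width by grid search; the function names (rbf_kernel, mmc_score, select_rbf_width) and the grid-search procedure are illustrative assumptions rather than details taken from the paper.

import numpy as np

def rbf_kernel(X, gamma):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def mmc_score(K, y):
    # MMC-style score tr(S_b) - tr(S_w), computed in the kernel-induced
    # feature space from the kernel matrix K and class labels y.
    y = np.asarray(y)
    n = len(y)
    grand = K.sum() / n ** 2                    # ||overall mean||^2
    tr_sb, tr_sw = 0.0, 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nc = len(idx)
        Kc = K[np.ix_(idx, idx)]
        mean_sq = Kc.sum() / nc ** 2            # ||class mean||^2
        cross = K[idx, :].sum() / (nc * n)      # <class mean, overall mean>
        tr_sb += (nc / n) * (mean_sq - 2.0 * cross + grand)
        tr_sw += np.trace(Kc) / n - (nc / n) * mean_sq
    return tr_sb - tr_sw

def select_rbf_width(X, y, gammas):
    # Grid search: return the candidate gamma that maximizes the MMC score.
    scores = [mmc_score(rbf_kernel(X, g), y) for g in gammas]
    return gammas[int(np.argmax(scores))]

For example, select_rbf_width(X, y, np.logspace(-3, 3, 13)) would scan a logarithmic grid of RBF widths; a polynomial-kernel variant would follow the same pattern with candidate degrees in place of the widths.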
About this paper
Cite this paper
Zhao, Y., Ma, J. (2014). Kernel Parameter Optimization for KFDA Based on the Maximum Margin Criterion. In: Zeng, Z., Li, Y., King, I. (eds.) Advances in Neural Networks – ISNN 2014. LNCS, vol. 8866. Springer, Cham. https://doi.org/10.1007/978-3-319-12436-0_37
DOI: https://doi.org/10.1007/978-3-319-12436-0_37
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12435-3
Online ISBN: 978-3-319-12436-0