Abstract
It is well known that linear slack penalty SVM training is equivalent to solving the Nearest Point Problem (NPP) over the so-called μ-Reduced Convex Hulls, that is, over convex combinations of the positive and negative samples whose coefficients are bounded above by a value μ < 1. In this work we give a simple approach to the classical Gilbert-Schlesinger-Kozinec (GSK) and Mitchell-Demyanov-Malozemov (MDM) algorithms for NPP that naturally leads to their updating pattern selection criteria and suggests an easy way to extend them to coefficient clipping algorithms for NPP over the μ-Reduced Convex Hulls. Although the extended GSK algorithm does not perform as well as the more complex recent proposal by Mavroforakis and Theodoridis, clipping the MDM coefficient updates results in a fast and efficient algorithm.
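To fix ideas, the sketch below illustrates the coefficient clipping idea on the simpler single-hull NPP originally considered by Mitchell, Dem'yanov and Malozemov (finding the point of a convex hull closest to the origin). It is only a minimal illustration under our own assumptions, not the paper's algorithm: the function name clipped_gilbert_step, the GSK/Gilbert-style vertex step, and the single-hull simplification are illustrative choices. The key point is that the line-search step λ is clipped so that no coefficient leaves the interval [0, μ], which keeps the iterate inside the μ-reduced convex hull.

```python
import numpy as np

def clipped_gilbert_step(X, alpha, mu):
    """One GSK/Gilbert-style step towards the point of the mu-reduced
    convex hull of the rows of X that is closest to the origin.

    alpha must satisfy sum(alpha) == 1 and 0 <= alpha_i <= mu.
    """
    w = alpha @ X                       # current point of the reduced hull
    grow = alpha < mu                   # coefficients that may still increase
    if not np.any(grow):
        return alpha, w                 # every coefficient is at its bound
    scores = X @ w                      # projections <x_i, w>
    cand = np.flatnonzero(grow)
    i = cand[np.argmin(scores[cand])]   # most promising vertex direction
    d = X[i] - w
    dd = d @ d
    if dd <= 0.0:
        return alpha, w                 # w already coincides with x_i
    # Unconstrained optimum of ||w + lam * d||^2 over lam in [0, 1] ...
    lam = np.clip(-(w @ d) / dd, 0.0, 1.0)
    # ... clipped further so that alpha_i + lam * (1 - alpha_i) <= mu,
    # which keeps the update inside the mu-reduced convex hull.
    if alpha[i] < 1.0:
        lam = min(lam, (mu - alpha[i]) / (1.0 - alpha[i]))
    alpha_new = (1.0 - lam) * alpha     # shrink all coefficients ...
    alpha_new[i] += lam                 # ... and move extra weight onto x_i
    return alpha_new, alpha_new @ X
```

Starting, for instance, from uniform coefficients alpha_i = 1/N (feasible whenever μ ≥ 1/N) and iterating this step drives w towards the nearest point of the μ-reduced hull; the two-hull SVM setting and the MDM-style pattern selection of the paper follow the same clipping principle but are not reproduced here.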
References
Schölkopf, B., Smola, A.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge (2002)
Platt, J.: Fast Training of Support Vector Machines Using Sequential Minimal Optimization. In: Advances in Kernel Methods – Support Vector Learning, pp. 185–208 (1999)
Joachims, T.: Making Large-Scale Support Vector Machine Learning Practical. In: Advances in Kernel Methods – Support Vector Learning, pp. 169–184 (1999)
Bennett, K., Bredensteiner, E.: Duality and Geometry in SVM Classifiers. In: 17th Int. Conf. on Machine Learning, pp. 57–64 (2000)
Keerthi, S., Shevade, S., Bhattacharyya, C., Murthy, K.: A Fast Iterative Nearest Point Algorithm for Support Vector Machine Classifier Design. IEEE Trans. on Neural Networks 11(1), 124–136 (2000)
Franc, V., Hlaváč, V.: Simple Solvers for Large Quadratic Programming Tasks. In: Kropatsch, W.G., Sablatnig, R., Hanbury, A. (eds.) DAGM 2005. LNCS, vol. 3663, pp. 75–84. Springer, Heidelberg (2005)
Gilbert, E.: Minimizing the Quadratic Norm on a Convex Set. SIAM J. Contr. 4, 61–79 (1966)
Franc, V., Hlaváč, V.: An Iterative Algorithm Learning the Maximal Margin Classifier. Pattern Recognition 36, 1985–1996 (2003)
Mitchell, B., Dem’yanov, V., Malozemov, V.: Finding the Point of a Polyhedron Closest to the Origin. SIAM J. Contr. 12, 19–26 (1974)
Shawe-Taylor, J., Cristianini, N.: On the Generalisation of Soft Margin Algorithms. IEEE Trans. on Information Theory 48(10), 2711–2735 (2002)
Mavroforakis, M., Theodoridis, S.: A Geometric Approach to Support Vector Machine (SVM) Classification. IEEE Trans. on Neural Networks 17(3), 671–682 (2006)
Tao, Q., Wu, G.W., Wang, J.: A General Soft Method for Learning SVM Classifiers with L1-Norm Penalty. Pattern Recognition 41(3), 939–948 (2008)
Rätsch, G.: Benchmark Repository, http://ida.first.fraunhofer.de/projects/bench/benchmarks.htm
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
López, J., Barbero, Á., Dorronsoro, J.R. (2008). Simple Clipping Algorithms for Reduced Convex Hull SVM Training. In: Corchado, E., Abraham, A., Pedrycz, W. (eds) Hybrid Artificial Intelligence Systems. HAIS 2008. Lecture Notes in Computer Science(), vol 5271. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87656-4_46
DOI: https://doi.org/10.1007/978-3-540-87656-4_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-87655-7
Online ISBN: 978-3-540-87656-4