Abstract
The new C-Mantec algorithm constructs compact neural network architectures for classification problems, incorporating new features such as competition between neurons and a built-in filtering stage for noisy examples. It was originally designed to tackle two-class problems, and in this work the extension of the algorithm to multiclass problems is analyzed. Three different approaches are investigated for extending the algorithm to multi-category pattern classification tasks: One-Against-All (OAA), One-Against-One (OAO), and P-Against-Q (PAQ). A set of benchmark problems of different sizes is used to analyze the prediction accuracy of the three implemented multi-class schemes and to compare the results to those obtained with three other standard classification algorithms.
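The One-Against-All scheme mentioned above trains one binary classifier per class and assigns a new pattern to the class whose classifier responds most strongly. The following is a minimal, hypothetical sketch of that decomposition only; it does not reproduce C-Mantec itself, and a trivial nearest-centroid scorer stands in for each two-class network:

```python
# Hypothetical One-Against-All (OAA) sketch: one binary scorer per class,
# prediction by highest score. A nearest-centroid model stands in for the
# per-class two-class network; C-Mantec itself is not implemented here.

def train_oaa(samples, labels):
    """Build one per-class model (here: the centroid of that class)."""
    models = {}
    for c in sorted(set(labels)):
        members = [x for x, y in zip(samples, labels) if y == c]
        models[c] = [sum(col) / len(members) for col in zip(*members)]
    return models

def predict_oaa(models, x):
    """Return the class whose model scores x highest.

    The negative squared distance to the class centroid plays the role of
    the binary classifier's output for that class.
    """
    def score(centroid):
        return -sum((a - b) ** 2 for a, b in zip(x, centroid))
    return max(models, key=lambda c: score(models[c]))

# Toy 3-class problem in two dimensions
X = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9), (0.0, 5.0), (0.1, 5.1)]
y = [0, 0, 1, 1, 2, 2]
oaa = train_oaa(X, y)
print(predict_oaa(oaa, (4.9, 5.2)))  # nearest the class-1 centroid
```

Under OAO, by contrast, one classifier would be trained per *pair* of classes (K(K-1)/2 models for K classes) with prediction by majority vote, and PAQ generalizes this to classifiers separating a group of P classes from a group of Q classes.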


Acknowledgements
The authors acknowledge support from MICIIN (Spain) through grant TIN2008-04985 (including FEDER funds) and from Junta de Andalucía through grants P06-TIC-01615 and P08-TIC-04026.
Subirats, J.L., Jerez, J.M., Gómez, I. et al. Multiclass Pattern Recognition Extension for the New C-Mantec Constructive Neural Network Algorithm. Cogn Comput 2, 285–290 (2010). https://doi.org/10.1007/s12559-010-9051-6