Abstract
The Cerebellar Model Articulation Controller (CMAC) has some attractive features: fast learning and the possibility of efficient digital hardware implementation. Besides these attractive features, it has a serious drawback: its memory complexity may be very large. In the multidimensional case the memory requirement can grow so large that the network cannot be implemented in practice. Several different approaches have been suggested to reduce memory complexity. Although these approaches may greatly reduce memory complexity, the reduction comes at a price: either both modelling and generalization capability deteriorate, or the training process becomes much more complicated. This paper proposes a new approach to complexity reduction, in which properly constructed hash-coding is combined with a regularized kernel representation. The proposed version exploits the benefits of the kernel representation and the complexity-reduction effect of hash-coding, while smoothing regularization helps to reduce the performance degradation.
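To make the combination of hash-coding and a regularized kernel representation concrete, the following Python sketch shows one possible reading of the idea: a binary CMAC whose conceptual association cells are hash-coded into a small physical memory, with the number of shared physical cells used as a kernel and the weights obtained by a regularized (ridge-type) least-squares solution. All names and parameter values (C, QUANT, MEM), the hash function, and the plain ridge term standing in for the paper's smoothing regularization are illustrative assumptions, not taken from the paper.

import numpy as np

C = 8        # number of overlapping CMAC layers (generalization parameter), assumed value
QUANT = 64   # quantization levels per input dimension, assumed value
MEM = 512    # size of the hash-coded physical memory (much smaller than the conceptual memory)

def active_cells(x):
    """Return the C hash-coded physical memory indices activated by input x (values in [0, 1))."""
    q = np.floor(x * QUANT).astype(int)
    idx = []
    for layer in range(C):
        coords = (q + layer) // C                # coordinates of the hit cell in this shifted layer
        key = hash((layer, tuple(coords)))       # conceptual address of the cell
        idx.append(key % MEM)                    # hash-coding: map the conceptual address to physical memory
    return np.array(idx)

def kernel(x1, x2):
    """CMAC-style kernel: number of physical cells shared by the two inputs."""
    return len(set(active_cells(x1)) & set(active_cells(x2)))

def fit(X, y, lam=0.1):
    """Regularized least-squares solution in the kernel representation (simple ridge term)."""
    K = np.array([[kernel(a, b) for b in X] for a in X], dtype=float)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, x):
    return sum(a * kernel(xt, x) for a, xt in zip(alpha, X_train))

# Toy usage: approximate a smooth 2-D function from random samples.
rng = np.random.default_rng(0)
X = rng.random((40, 2))
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(2 * np.pi * X[:, 1])
alpha = fit(X, y)
print(predict(X, alpha, np.array([0.3, 0.7])))

In this sketch, hash collisions are exactly what reduces the memory: distinct conceptual cells may share a physical weight, which is the source of the performance degradation that the regularization term is meant to mitigate.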
Cite this paper
Horváth, G., Gáti, K. (2009). Kernel CMAC with Reduced Memory Complexity. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds.) Artificial Neural Networks – ICANN 2009. Lecture Notes in Computer Science, vol. 5768. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04274-4_72
DOI: https://doi.org/10.1007/978-3-642-04274-4_72
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-04273-7
Online ISBN: 978-3-642-04274-4