Abstract
An ensemble method combines a group of base classifiers to yield a classifier that outperforms each individual member. Majority voting is one of the most widely used rules for combining the base classifiers. In this paper, we propose to combine classifiers using an associative memory model. Specifically, we introduce ensemble methods based on recurrent correlation associative memories (RCAMs) for binary classification problems. We show that an RCAM-based ensemble classifier can be viewed as a majority-vote classifier whose weights depend on the similarity between the base classifiers and the resulting ensemble; in other words, the RCAM-based ensemble combines the classifiers using a recurrent consult-and-vote scheme. Computational experiments confirm the potential of RCAM-based ensemble methods for binary classification problems.
This work was supported in part by CNPq under grant no. 310118/2017-4, FAPESP under grant no. 2019/02278-2, and Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.
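To make the recurrent consult-and-vote scheme concrete, the sketch below implements one plausible reading of it, assuming Chiueh and Goodman's exponential correlation associative memory (excitation function f(z) = e^{αz}) as the RCAM and bipolar (±1) base-classifier predictions stored as the memory's fundamental patterns. The function name `rcam_ensemble_predict`, the parameter `alpha`, and the majority-vote initialization are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def rcam_ensemble_predict(U, alpha=1.0, max_iter=100):
    """Combine bipolar base-classifier predictions with an ECAM-style
    recurrent consult-and-vote scheme (illustrative sketch, not the
    paper's exact formulation).

    U : (p, n) array with entries in {-1, +1}; row xi holds the
        predictions of base classifier xi on the n query samples.
    """
    p, n = U.shape
    x = np.sign(U.sum(axis=0))           # start from the plain majority vote
    x[x == 0] = 1                        # break ties arbitrarily
    for _ in range(max_iter):
        # Weight of each base classifier: exponential of its normalized
        # agreement (similarity) with the current ensemble state.
        w = np.exp(alpha * (U @ x) / n)
        x_new = np.sign(w @ U)           # weighted majority vote
        x_new[x_new == 0] = 1
        if np.array_equal(x_new, x):     # fixed point reached
            break
        x = x_new
    return x

# Example: three base classifiers voting on five samples.
U = np.array([[ 1,  1, -1,  1, -1],
              [ 1, -1, -1,  1,  1],
              [-1,  1, -1,  1, -1]])
print(rcam_ensemble_predict(U, alpha=2.0))
```

At each iteration, a base classifier's weight grows exponentially with its agreement with the evolving consensus, so classifiers similar to the current ensemble state vote more strongly, which matches the abstract's description of similarity-dependent majority-vote weights.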
Cite this paper
Lobo, R.A., Valle, M.E. (2020). Ensemble of Binary Classifiers Combined Using Recurrent Correlation Associative Memories. In: Cerri, R., Prati, R.C. (eds.) Intelligent Systems. BRACIS 2020. Lecture Notes in Computer Science, vol. 12320. Springer, Cham. https://doi.org/10.1007/978-3-030-61380-8_30