
Enhanced MWO Training Algorithm to Improve Classification Accuracy of Artificial Neural Networks

  • Conference paper
Recent Advances on Soft Computing and Data Mining

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 287))

Abstract

The Mussels Wandering Optimization (MWO) algorithm is a novel meta-heuristic optimization algorithm ecologically inspired by the movement behavior of mussels. MWO has been used to solve linear and nonlinear functions and has been adapted for the supervised training of Artificial Neural Networks (ANN). In that application, the classification accuracy of an MWO-trained ANN was on par with that of other training algorithms. This paper proposes an enhanced version of the MWO algorithm, named Enhanced-MWO (E-MWO), to achieve improved ANN classification accuracy. In addition, the paper discusses and analyses MWO and the effect of MWO parameter selection (especially the shape parameter) on ANN classification accuracy. The E-MWO algorithm is adapted for training ANN, tested on well-known benchmark problems, and compared against other algorithms. The obtained results indicate that E-MWO is a competitive alternative to other evolutionary and gradient-descent based training algorithms in terms of classification accuracy and training time.
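
To make the general idea concrete, the Python sketch below illustrates how a population-based, MWO-style meta-heuristic can train the weights of a small feed-forward classifier: each "mussel" is a flat weight vector, its fitness is the classification error of the decoded network, and the position update uses a heavy-tailed random step whose length is governed by a shape parameter mu. The specific update rule, the drift toward the best mussel, the step scaling, and all function names here are illustrative assumptions, not the authors' MWO or E-MWO formulation.

    # Illustrative sketch only: MWO-style population search over ANN weights.
    # The update rule is a generic heavy-tailed random walk controlled by a
    # shape parameter `mu`; it is NOT the paper's E-MWO algorithm.
    import numpy as np

    rng = np.random.default_rng(0)

    def forward(weights, X, n_in, n_hid, n_out):
        """Decode a flat weight vector into a 1-hidden-layer net and run it."""
        w1 = weights[: n_in * n_hid].reshape(n_in, n_hid)
        w2 = weights[n_in * n_hid:].reshape(n_hid, n_out)
        return np.tanh(X @ w1) @ w2

    def error_rate(weights, X, y, n_in, n_hid, n_out):
        """Fitness = misclassification rate (lower is better)."""
        pred = forward(weights, X, n_in, n_hid, n_out).argmax(axis=1)
        return np.mean(pred != y)

    def mwo_like_train(X, y, n_hid=5, pop=30, iters=200, mu=1.5):
        n_in, n_out = X.shape[1], int(y.max()) + 1
        dim = n_in * n_hid + n_hid * n_out
        mussels = rng.normal(0.0, 1.0, size=(pop, dim))   # candidate weight vectors
        fitness = np.array([error_rate(m, X, y, n_in, n_hid, n_out) for m in mussels])
        for _ in range(iters):
            best = mussels[fitness.argmin()]
            # Heavy-tailed step lengths: larger `mu` gives shorter, more local moves.
            steps = rng.pareto(mu, size=pop)[:, None]
            trial = (mussels
                     + 0.1 * steps * rng.standard_normal((pop, dim))
                     + 0.5 * (best - mussels))             # drift toward the best mussel
            trial_fit = np.array([error_rate(t, X, y, n_in, n_hid, n_out) for t in trial])
            improved = trial_fit < fitness                  # greedy replacement
            mussels[improved], fitness[improved] = trial[improved], trial_fit[improved]
        return mussels[fitness.argmin()], fitness.min()

Calling mwo_like_train(X, y) on a numeric feature matrix X and integer class labels y would return the best weight vector found and its error rate. The only purpose of the sketch is to show where the shape parameter enters the search, since its effect on classification accuracy is one of the quantities the paper analyses.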

Author information

Corresponding author

Correspondence to Ahmed A. Abusnaina.

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Abusnaina, A.A., Abdullah, R., Kattan, A. (2014). Enhanced MWO Training Algorithm to Improve Classification Accuracy of Artificial Neural Networks. In: Herawan, T., Ghazali, R., Deris, M. (eds) Recent Advances on Soft Computing and Data Mining. Advances in Intelligent Systems and Computing, vol 287. Springer, Cham. https://doi.org/10.1007/978-3-319-07692-8_18

  • DOI: https://doi.org/10.1007/978-3-319-07692-8_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07691-1

  • Online ISBN: 978-3-319-07692-8

  • eBook Packages: Engineering, Engineering (R0)
