
An integrated particle swarm optimization approach hybridizing a new self-adaptive particle swarm optimization with a modified differential evolution

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Hybridizing particle swarm optimization (PSO) with differential evolution (DE), this paper proposes an integrated PSO–DE optimizer and examines its performance. First, a new self-adaptive PSO (SAPSO) is established to guide the movements of particles in the proposed hybrid PSO. To achieve a good trade-off between global and local search capabilities, a self-adaptive strategy is proposed that adaptively updates the three main control parameters of the particles in SAPSO. Since the performance of PSO relies heavily on its convergence, the convergence of SAPSO is analytically investigated and a convergence-guaranteed parameter selection rule is provided. Subsequently, a modified self-adaptive differential evolution is presented to evolve the personal best positions of the particles in the proposed hybrid PSO in order to mitigate potential stagnation. The performance of the proposed method is then validated on 25 benchmark test functions and two real-world problems. The simulation results confirm that the proposed method performs significantly better than its peers at a confidence level of 95% over the 25 benchmarks in terms of solution optimality. In addition, the proposed method outperforms its contenders over the majority of the 25 benchmarks with respect to search reliability and convergence speed, while its computational complexity is comparable with those of the other enhanced PSO–DE methods compared. The simulation results on the two real-world problems show that the proposed method dominates its competitors in terms of solution optimality.
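For readers unfamiliar with the overall structure of such a hybrid, the sketch below illustrates a generic PSO update with a time-varying inertia weight, combined with a DE/rand/1/bin step applied to the personal best positions. This is an illustrative simplification under assumed parameter schedules (a linearly decreasing inertia weight and fixed acceleration coefficients), not the SAPSO self-adaptive rule or the modified DE proposed in the paper; the objective `sphere` and all parameter values are placeholders.

```python
import numpy as np

def sphere(x):
    # Simple placeholder objective (sphere function).
    return float(np.sum(x ** 2))

def hybrid_pso_de(f, dim=10, swarm=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (swarm, dim))   # particle positions
    v = np.zeros((swarm, dim))                 # particle velocities
    pbest = x.copy()                           # personal best positions
    pcost = np.array([f(p) for p in pbest])
    g = pbest[pcost.argmin()].copy()           # global best position

    for t in range(iters):
        # Assumed schedules: linearly decreasing inertia weight,
        # fixed acceleration coefficients (not the paper's SAPSO rule).
        w = 0.9 - 0.5 * t / iters
        c1 = c2 = 1.5
        r1 = rng.random((swarm, dim))
        r2 = rng.random((swarm, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v

        # Greedy personal-best update from the new positions.
        cost = np.array([f(p) for p in x])
        better = cost < pcost
        pbest[better], pcost[better] = x[better], cost[better]

        # DE/rand/1/bin step on the personal bests to counter stagnation.
        F, CR = 0.5, 0.9
        for i in range(swarm):
            a, b, c = rng.choice(
                [j for j in range(swarm) if j != i], 3, replace=False)
            trial = np.where(rng.random(dim) < CR,
                             pbest[a] + F * (pbest[b] - pbest[c]),
                             pbest[i])
            tc = f(trial)
            if tc < pcost[i]:
                pbest[i], pcost[i] = trial, tc

        g = pbest[pcost.argmin()].copy()
    return g, pcost.min()

best, best_cost = hybrid_pso_de(sphere)
```

Applying DE to the personal bests rather than the positions preserves the PSO dynamics while still injecting new search directions; the authors' actual implementation is linked in the Acknowledgements.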





Acknowledgements

The authors express their heartfelt thanks to the editor and all reviewers for their valuable suggestions to improve this work. Readers are welcome to contact the authors at tbw198732@sina.com, or to obtain the reference code for this work at https://github.com/Autumn0/PSO-simulation-codes. Funding was provided by the National Science Foundation of China (Grant No. 61603284).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Kui Xiang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest regarding this paper.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Appendices

Appendix A

The fitness curves of \(E_{\mathrm{mean}}\) of the different methods for the 25 30-dimensional test functions are presented in this appendix (see Figs. 10, 11 and 12).

Fig. 10

Convergence graphs of \(E_{\mathrm{mean}}\) of different PSO variants for functions \(F_1\)–\(F_{10}\). a \(F_1\), b \(F_2\), c \(F_3\), d \(F_4\), e \(F_5\), f \(F_6\), g \(F_7\), h \(F_8\), i \(F_9\), j \(F_{10}\)

Fig. 11

Convergence graphs of \(E_{\mathrm{mean}}\) of different PSO variants for functions \(F_{11}\)–\(F_{18}\). a \(F_{11}\), b \(F_{12}\), c \(F_{13}\), d \(F_{14}\), e \(F_{15}\), f \(F_{16}\), g \(F_{17}\), h \(F_{18}\)

Fig. 12

Convergence graphs of \(E_{\mathrm{mean}}\) of different PSO variants for functions \(F_{19}\)–\(F_{25}\). a \(F_{19}\), b \(F_{20}\), c \(F_{21}\), d \(F_{22}\), e \(F_{23}\), f \(F_{24}\), g \(F_{25}\)

Appendix B

The fitness curves of \(E_{\mathrm{mean}}\) of the different methods for the eight 50-dimensional test functions are presented in this appendix (Fig. 13).

Fig. 13

Convergence graphs of \(E_{\mathrm{mean}}\) obtained by different methods for different 50-dimensional test functions. a \(F_{18}\), b \(F_{19}\), c \(F_{20}\), d \(F_{21}\), e \(F_{22}\), f \(F_{23}\), g \(F_{24}\), h \(F_{25}\)


About this article


Cite this article

Tang, B., Xiang, K. & Pang, M. An integrated particle swarm optimization approach hybridizing a new self-adaptive particle swarm optimization with a modified differential evolution. Neural Comput & Applic 32, 4849–4883 (2020). https://doi.org/10.1007/s00521-018-3878-2
