
Self-adjusting Population Sizes for the \((1, \lambda )\)-EA on Monotone Functions

  • Conference paper
Parallel Problem Solving from Nature – PPSN XVII (PPSN 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13399)


Abstract

We study the \((1,\lambda )\)-EA with mutation rate c/n for \(c\le 1\), where the population size is adaptively controlled with the \((1:s+1)\)-success rule. Recently, Hevia Fajardo and Sudholt showed that this setup with \(c=1\) is efficient on OneMax for \(s<1\), but inefficient if \(s \ge 18\). Surprisingly, the hardest part is not close to the optimum, but at linear distance from it. We show that this behavior is not specific to OneMax. If s is small, the algorithm is efficient on all monotone functions; if s is large, it needs superpolynomial time on all monotone functions. For small \(s\) and \(c<1\) we show an \(O(n)\) upper bound on the number of generations and \(O(n\log n)\) on the number of function evaluations; for small \(s\) and \(c=1\) we show that the optimum is reached in \(O(n\log n)\) generations and \(O(n^2\log \log n)\) evaluations. We also show formally that optimization is always fast, regardless of s, if the algorithm starts in proximity of the optimum. All results also hold in a dynamic environment where the fitness function changes in each generation.

Extended Abstract. The proofs and further details are available on arxiv [32].
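The algorithm studied in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: it runs the \((1,\lambda)\)-EA on OneMax with standard bit mutation at rate c/n and adjusts \(\lambda\) with the \((1:s+1)\)-success rule in the multiplicative form used by Hevia Fajardo and Sudholt (divide \(\lambda\) by F on a success, multiply by \(F^{1/s}\) on a failure, so the rule equilibrates at a success rate of roughly \(1/(s+1)\)). The update factor F, the cap \(\lambda \le n\), and all function names are assumptions for illustration.

```python
import random


def onemax(x):
    """Fitness: number of one-bits."""
    return sum(x)


def self_adjusting_one_comma_lambda_ea(n, s=0.5, c=1.0, F=1.5,
                                       max_gens=100_000, seed=0):
    """Sketch of the (1,lambda)-EA with the (1:s+1)-success rule on OneMax.

    Returns the number of generations until the optimum is found,
    or None if max_gens is exceeded.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    lam = 1.0  # real-valued; round(lam) offspring are produced
    for gen in range(1, max_gens + 1):
        # Generate round(lam) offspring by standard bit mutation, rate c/n.
        offspring = [
            [1 - b if rng.random() < c / n else b for b in x]
            for _ in range(max(1, round(lam)))
        ]
        best = max(offspring, key=onemax)
        success = onemax(best) > onemax(x)
        x = best  # comma selection: the parent is always replaced
        # (1:s+1)-success rule: shrink lambda on success, grow on failure.
        lam = lam / F if success else lam * F ** (1.0 / s)
        lam = min(max(lam, 1.0), float(n))  # keep lambda in [1, n]
        if onemax(x) == n:
            return gen
    return None
```

With a small success ratio such as s = 0.5, the rule keeps \(\lambda\) large enough that the comma strategy rarely loses fitness, matching the efficient regime described in the abstract; with large s, \(\lambda\) is driven down and progress stalls at linear distance from the optimum.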

M. Kaufmann—Supported by the Swiss National Science Foundation [grant number 192079].

X. Zou—Supported by the Swiss National Science Foundation [grant number CR-SII5_173721].


References

  1. Aleti, A., Moser, I.: A systematic literature review of adaptive parameter control methods for evolutionary algorithms. ACM Comput. Surv. 49(3), 1–35 (2016)

  2. Antipov, D., Doerr, B., Yang, Q.: The efficiency threshold for the offspring population size of the \((\mu , \lambda )\) EA. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 1461–1469 (2019)

  3. Auger, A.: Benchmarking the (1+1) evolution strategy with one-fifth success rule on the BBOB-2009 function testbed. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 2447–2452 (2009)

  4. Badkobeh, G., Lehre, P.K., Sudholt, D.: Unbiased black-box complexity of parallel search. In: Bartz-Beielstein, T., Branke, J., Filipič, B., Smith, J. (eds.) PPSN 2014. LNCS, vol. 8672, pp. 892–901. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-10762-2_88

  5. Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Schaefer, R., Cotta, C., Kołodziej, J., Rudolph, G. (eds.) PPSN 2010. LNCS, vol. 6238, pp. 1–10. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-15844-5_1

  6. Case, B., Lehre, P.K.: Self-adaptation in non-elitist evolutionary algorithms on discrete problems with unknown structure. IEEE Trans. Evol. Comput. 24(4), 650–663 (2020)

  7. Colin, S., Doerr, B., Férey, G.: Monotonic functions in EC: anything but monotone! In: Genetic and Evolutionary Computation Conference (GECCO), pp. 753–760 (2014)

  8. Devroye, L.: The compound random search. Ph.D. dissertation, Purdue University, West Lafayette, IN (1972)

  9. Doerr, B., Doerr, C.: Optimal static and self-adjusting parameter choices for the \((1+(\lambda,\lambda ))\) genetic algorithm. Algorithmica 80(5), 1658–1709 (2018)

  10. Doerr, B., Doerr, C.: Theory of parameter control for discrete black-box optimization: provable performance gains through dynamic parameter choices. In: Theory of Evolutionary Computation. NCS, pp. 271–321. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-29414-4_6

  11. Doerr, B., Doerr, C., Ebel, F.: From black-box complexity to designing new genetic algorithms. Theoret. Comput. Sci. 567, 87–104 (2015)

  12. Doerr, B., Doerr, C., Kötzing, T.: Provably optimal self-adjusting step sizes for multi-valued decision variables. In: Handl, J., Hart, E., Lewis, P.R., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds.) PPSN 2016. LNCS, vol. 9921, pp. 782–791. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-45823-6_73

  13. Doerr, B., Doerr, C., Kötzing, T.: Static and self-adjusting mutation strengths for multi-valued decision variables. Algorithmica 80(5), 1732–1768 (2018)

  14. Doerr, B., Doerr, C., Lengler, J.: Self-adjusting mutation rates with provably optimal success rules. Algorithmica 83(10), 3108–3147 (2021)

  15. Doerr, B., Doerr, C., Yang, J.: Optimal parameter choices via precise black-box analysis. Theoret. Comput. Sci. 801, 1–34 (2020)

  16. Doerr, B., Gießen, C., Witt, C., Yang, J.: The \((1+\lambda )\) evolutionary algorithm with self-adjusting mutation rate. Algorithmica 81(2), 593–631 (2019)

  17. Doerr, B., Jansen, T., Sudholt, D., Winzen, C., Zarges, C.: Optimizing monotone functions can be difficult. In: Schaefer, R., Cotta, C., Kołodziej, J., Rudolph, G. (eds.) PPSN 2010. LNCS, vol. 6238, pp. 42–51. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-15844-5_5

  18. Doerr, B., Jansen, T., Sudholt, D., Winzen, C., Zarges, C.: Mutation rate matters even when optimizing monotonic functions. Evol. Comput. 21(1), 1–27 (2013)

  19. Doerr, B., Johannsen, D., Winzen, C.: Multiplicative drift analysis. Algorithmica 64, 673–697 (2012)

  20. Doerr, B., Künnemann, M.: Optimizing linear functions with the \((1+\lambda )\) evolutionary algorithm: different asymptotic runtimes for different instances. Theoret. Comput. Sci. 561, 3–23 (2015)

  21. Doerr, B., Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: On the runtime analysis of selection hyper-heuristics with adaptive learning periods. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 1015–1022 (2018)

  22. Doerr, B., Witt, C., Yang, J.: Runtime analysis for self-adaptive mutation rates. Algorithmica 83(4), 1012–1053 (2021)

  23. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3, 124–141 (1999)

  24. Grimmett, G.R.: Percolation, vol. 321. Springer, Heidelberg (1999). https://doi.org/10.1007/978-3-662-03981-6

  25. Hevia Fajardo, M.A., Sudholt, D.: On the choice of the parameter control mechanism in the \((1+(\lambda ,\lambda ))\) genetic algorithm. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 832–840 (2020)

  26. Hevia Fajardo, M.A., Sudholt, D.: Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter. arXiv preprint arXiv:2104.05624 (2021)

  27. Hevia Fajardo, M.A., Sudholt, D.: Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 1151–1159 (2021)

  28. Jägersküpper, J., Storch, T.: When the plus strategy outperforms the comma strategy and when not. In: 2007 IEEE Symposium on Foundations of Computational Intelligence, pp. 25–32. IEEE (2007)

  29. Jansen, T.: On the brittleness of evolutionary algorithms. In: Stephens, C.R., Toussaint, M., Whitley, D., Stadler, P.F. (eds.) FOGA 2007. LNCS, vol. 4436, pp. 54–69. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-73482-6_4

  30. Karafotias, G., Hoogendoorn, M., Eiben, Á.E.: Parameter control in evolutionary algorithms: trends and challenges. IEEE Trans. Evol. Comput. 19(2), 167–187 (2014)

  31. Kaufmann, M., Larcher, M., Lengler, J., Zou, X.: OneMax is not the easiest function for fitness improvements (2022). https://arxiv.org/abs/2204.07017

  32. Kaufmann, M., Larcher, M., Lengler, J., Zou, X.: Self-adjusting population sizes for the \((1, \lambda )\)-EA on monotone functions (2022). https://arxiv.org/abs/2204.00531

  33. Kern, S., Müller, S.D., Hansen, N., Büche, D., Ocenasek, J., Koumoutsakos, P.: Learning probability distributions in continuous evolutionary algorithms: a comparative review. Nat. Comput. 3(1), 77–112 (2004)

  34. Kötzing, T.: Concentration of first hitting times under additive drift. Algorithmica 75(3), 490–506 (2016)

  35. Lässig, J., Sudholt, D.: Adaptive population models for offspring populations and parallel evolutionary algorithms. In: Foundations of Genetic Algorithms (FOGA), pp. 181–192 (2011)

  36. Lehre, P.K., Witt, C.: Black-box search by unbiased variation. Algorithmica 64, 623–642 (2012)

  37. Lengler, J.: A general dichotomy of evolutionary algorithms on monotone functions. IEEE Trans. Evol. Comput. 24(6), 995–1009 (2019)

  38. Lengler, J.: Drift analysis. In: Theory of Evolutionary Computation. NCS, pp. 89–131. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-29414-4_2

  39. Lengler, J., Martinsson, A., Steger, A.: When does hillclimbing fail on monotone functions: an entropy compression argument. In: Analytic Algorithmics and Combinatorics (ANALCO), pp. 94–102. SIAM (2019)

  40. Lengler, J., Meier, J.: Large population sizes and crossover help in dynamic environments. In: Bäck, T., et al. (eds.) PPSN 2020. LNCS, vol. 12269, pp. 610–622. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58112-1_42

  41. Lengler, J., Riedi, S.: Runtime analysis of the \((\mu + 1)\)-EA on the dynamic BinVal function. In: Zarges, C., Verel, S. (eds.) EvoCOP 2021. LNCS, vol. 12692, pp. 84–99. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-72904-2_6

  42. Lengler, J., Schaller, U.: The \((1+1)\)-EA on noisy linear functions with random positive weights. In: Symposium Series on Computational Intelligence (SSCI), pp. 712–719. IEEE (2018)

  43. Lengler, J., Steger, A.: Drift analysis and evolutionary algorithms revisited. Comb. Probab. Comput. 27(4), 643–666 (2018)

  44. Lengler, J., Zou, X.: Exponential slowdown for larger populations: the \((\mu + 1)\)-EA on monotone functions. Theoret. Comput. Sci. 875, 28–51 (2021)

  45. Lissovoi, A., Oliveto, P., Warwicker, J.A.: How the duration of the learning period affects the performance of random gradient selection hyper-heuristics. In: AAAI Conference on Artificial Intelligence (AAAI), vol. 34, no. 3, pp. 2376–2383 (2020)

  46. Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: On the time complexity of algorithm selection hyper-heuristics for multimodal optimisation. In: AAAI Conference on Artificial Intelligence (AAAI), vol. 33, no. 1, pp. 2322–2329 (2019)

  47. Lissovoi, A., Oliveto, P.S., Warwicker, J.A.: Simple hyper-heuristics control the neighbourhood size of randomised local search optimally for LeadingOnes. Evol. Comput. 28(3), 437–461 (2020)

  48. Mambrini, A., Sudholt, D.: Design and analysis of schemes for adapting migration intervals in parallel evolutionary algorithms. Evol. Comput. 23(4), 559–582 (2015)

  49. Rajabi, A., Witt, C.: Evolutionary algorithms with self-adjusting asymmetric mutation. In: Bäck, T., et al. (eds.) PPSN 2020. LNCS, vol. 12269, pp. 664–677. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58112-1_46

  50. Rajabi, A., Witt, C.: Self-adjusting evolutionary algorithms for multimodal optimization. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 1314–1322 (2020)

  51. Rechenberg, I.: Evolutionsstrategien. In: Schneider, B., Ranft, U. (eds.) Simulationsmethoden in der Medizin und Biologie, pp. 83–114. Springer, Heidelberg (1978). https://doi.org/10.1007/978-3-642-81283-5_8

  52. Rodionova, A., Antonov, K., Buzdalova, A., Doerr, C.: Offspring population size matters when comparing evolutionary algorithms with self-adjusting mutation rates. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 855–863 (2019)

  53. Rowe, J.E., Sudholt, D.: The choice of the offspring population size in the \((1,\lambda )\) evolutionary algorithm. Theoret. Comput. Sci. 545, 20–38 (2014)

  54. Schumer, M., Steiglitz, K.: Adaptive step size random search. IEEE Trans. Autom. Control 13(3), 270–276 (1968)


Acknowledgements

We thank Dirk Sudholt for helpful discussions during the Dagstuhl seminar 22081 “Theory of Randomized Optimization Heuristics”. We also thank the reviewers for their helpful comments and suggestions.

Author information

Correspondence to Xun Zou.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Kaufmann, M., Larcher, M., Lengler, J., Zou, X. (2022). Self-adjusting Population Sizes for the \((1, \lambda )\)-EA on Monotone Functions. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol 13399. Springer, Cham. https://doi.org/10.1007/978-3-031-14721-0_40


  • DOI: https://doi.org/10.1007/978-3-031-14721-0_40

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-14720-3

  • Online ISBN: 978-3-031-14721-0

  • eBook Packages: Computer Science, Computer Science (R0)
