Abstract
Computational and technological advances have increased data generation and storage capacity, and many annotated datasets are now used to train machine learning models for predictive tasks. Feature selection (FS) is a combinatorial binary optimization problem arising from the need to reduce dataset dimensionality by finding the feature subset with maximum predictive accuracy. While different methodologies have been proposed, metaheuristics adapted to binary optimization have proven to be reliable and efficient techniques for FS. This paper applies the Lichtenberg algorithm (LA), the first population-trajectory metaheuristic, and enhances it with the Fibonacci sequence to improve its exploration capabilities in FS. By replacing the random scale that controls the size of the Lichtenberg figures (LFs) and the population distribution in the original version with a sequence based on the golden ratio, a new LF size decay with an optimal exploration–exploitation balance is presented. The resulting golden Lichtenberg algorithm (GLA), which requires few hyperparameters, the original LA, and eight other popular metaheuristics are then equipped with a v-shaped transfer function and coupled with the K-nearest neighbor classifier to search for optimized feature subsets through a double cross-validation experiment on 15 UCI machine learning repository datasets. The binary GLA selected reduced feature subsets, achieving the best predictive accuracy and fitness values at the lowest computational cost.
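The abstract's two key mechanisms — a golden-ratio-based decay of the Lichtenberg figure scale and a v-shaped transfer function for binarizing continuous positions into feature masks — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: the function names, the decay exponent `k`, and the specific decay form are assumptions; only the use of the golden ratio and of a v-shaped transfer function (here `|tanh(x)|`, a common member of the v-shaped family) come from the abstract.

```python
import math
import random

PHI = (1 + math.sqrt(5)) / 2  # golden ratio, approx. 1.618

def golden_scale(t, n_iter, k=10):
    """Illustrative golden-ratio decay of the Lichtenberg figure scale.

    Starts at 1 (full-size figure, exploration) and shrinks toward 0
    (small figure, exploitation) as iteration t approaches n_iter.
    The exponent k controlling the decay speed is an assumed value.
    """
    return PHI ** (-k * t / n_iter)

def v_transfer(x):
    """V-shaped transfer function: maps a continuous step to a
    probability of flipping the corresponding feature bit."""
    return abs(math.tanh(x))

def binarize(position):
    """Convert a continuous position vector into a binary feature
    mask (1 = feature selected, 0 = feature discarded)."""
    return [1 if random.random() < v_transfer(x) else 0 for x in position]
```

In a wrapper FS setup, each binary mask produced by `binarize` would be scored by training a KNN classifier on the selected features, and the fitness (accuracy plus a subset-size penalty) would guide the next iteration.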







Data availability
The datasets used are available in the UCI machine learning repository at https://archive.ics.uci.edu/ml/index.php. The MATLAB code of the Lichtenberg algorithm can be found at https://www.mathworks.com/matlabcentral/fileexchange/84732-lichtenberg-algorithm-la.
Abbreviations
- DM: Data mining
- ML: Machine learning
- FS: Feature selection
- NFL: No-free-lunch theorem
- LA: Lichtenberg algorithm
- GLA: Golden Lichtenberg algorithm
- DLA: Diffusion-limited aggregation
- LF: Lichtenberg figure
- S: Stickiness factor
- Np: Number of particles
- Rc: Creation radius
- Ref: Refinement
- M: Figure switching factor
- Pop: Population
- Niter: Number of iterations
- KNN: K-nearest neighbors
- BLA: Binary Lichtenberg algorithm
- MH: Metaheuristics
- PSO: Particle swarm optimization
- DE: Differential evolution
- GA: Genetic algorithm
- MBO: Monarch butterfly optimization
- SSA: Salp swarm algorithm
- WOA: Whale optimization algorithm
- HHO: Harris Hawks optimization
- MRFO: Manta ray foraging optimization
Acknowledgements
The authors acknowledge the financial support from the FAPESP (São Paulo Research Foundation, grants #2023/10419-0, #2022/10683-7, and #2021/06870-3) and FAPEMIG (Fundação de Amparo à Pesquisa do Estado de Minas Gerais, grant APQ-00385-18).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Pereira, J.L.J., Francisco, M.B., Ma, B.J. et al. Golden lichtenberg algorithm: a fibonacci sequence approach applied to feature selection. Neural Comput & Applic 36, 20493–20511 (2024). https://doi.org/10.1007/s00521-024-10155-9