Abstract
In this paper, we propose a multi-objective evolutionary algorithm for automatic deep neural architecture search. The algorithm optimizes model performance together with the number of network parameters, which allows it to explore architectures that are both accurate and compact. We test the proposed solution on several image classification data sets, including MNIST, Fashion-MNIST, and CIFAR-10, and we consider deep architectures comprising both convolutional and fully connected networks. We also examine the effects of two different versions of multi-objective selection. Our approach outperforms both the considered baseline architectures and the standard genetic algorithm used in our previous work.
This work was partially supported by the Czech Science Foundation project no. 18-23827S and institutional support of the Institute of Computer Science RVO 67985807.
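As a rough illustration of the search the abstract describes, the sketch below decodes a candidate genome into a Keras model and returns the two objective values, validation error and parameter count, that a multi-objective selection would then trade off. This is a minimal sketch under assumed conventions, not the authors' implementation: the genome encoding (a list of hidden-layer widths), the function names, and the training budget are all illustrative assumptions.

    # Illustrative sketch, not the authors' code: evaluate one candidate
    # architecture on the two objectives named in the abstract.
    from tensorflow import keras

    def decode(genome, input_dim=784, n_classes=10):
        """Build a fully connected model from a list of hidden-layer
        widths (hypothetical genome encoding)."""
        model = keras.Sequential()
        model.add(keras.Input(shape=(input_dim,)))
        for width in genome:
            model.add(keras.layers.Dense(width, activation="relu"))
        model.add(keras.layers.Dense(n_classes, activation="softmax"))
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def evaluate(genome, x_train, y_train, x_val, y_val, epochs=5):
        """Return the objective tuple (validation error, parameter count)."""
        model = decode(genome, input_dim=x_train.shape[1])
        model.fit(x_train, y_train, epochs=epochs, batch_size=128, verbose=0)
        _, acc = model.evaluate(x_val, y_val, verbose=0)
        return 1.0 - acc, model.count_params()  # both objectives minimised

    def dominates(a, b):
        """Pareto dominance for tuples of minimised objectives."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

With objective tuples of this form, a non-dominated-sorting selection (for example, DEAP's tools.selNSGA2) retains a whole Pareto front of architectures rather than a single best model, which is what lets the search surface networks that are both accurate and compact.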
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Vidnerová, P., Neruda, R. (2020). Multi-objective Evolution for Deep Neural Network Architecture Search. In: Yang, H., Pasupa, K., Leung, A.C.S., Kwok, J.T., Chan, J.H., King, I. (eds.) Neural Information Processing. ICONIP 2020. Lecture Notes in Computer Science, vol. 12534. Springer, Cham. https://doi.org/10.1007/978-3-030-63836-8_23