Energy Efficiency of Python Machine Learning Frameworks | SpringerLink

Energy Efficiency of Python Machine Learning Frameworks

  • Conference paper
Intelligent Systems Design and Applications (ISDA 2022)

Part of the book series: Lecture Notes in Networks and Systems ((LNNS,volume 715))


Abstract

Although machine learning (ML) has been an active research field for decades, applications demanding high computational power have only recently proliferated. Developers typically focus on solving ML problems without considering how much energy the underlying frameworks consume. This study compares four widely used frameworks, TensorFlow, Keras, PyTorch, and Scikit-learn, along several dimensions: energy efficiency, memory usage, execution time, and accuracy. We monitor the performance of these frameworks on well-known ML benchmark problems. Our results reveal interesting findings, such as frameworks whose execution speed does not predict their energy consumption or memory usage. We show how our results can help ML developers decide which framework to use for their applications when energy efficiency is a concern.
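The abstract does not detail the measurement tooling, but the kind of per-framework profiling it describes can be illustrated with a minimal harness. The sketch below, which is an assumption rather than the authors' actual setup, measures wall-clock time and peak memory of an arbitrary training callable using only the Python standard library; `train_dummy` is a hypothetical placeholder for a framework's `fit()` call, and energy readings would additionally require a hardware interface such as Intel RAPL (e.g., via a tool like pyRAPL), which is not shown here.

```python
import time
import tracemalloc


def profile(fn, *args, **kwargs):
    """Run fn and report wall-clock time and peak memory.

    A simplified stand-in for the kind of measurement harness the
    paper describes. Energy consumption is NOT measured here; that
    would require hardware counters (e.g., Intel RAPL on Linux).
    """
    tracemalloc.start()
    t0 = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()  # (current, peak) in bytes
    tracemalloc.stop()
    return result, elapsed, peak


def train_dummy(n):
    # Hypothetical workload standing in for a framework's fit() call.
    return sum(i * i for i in range(n))


result, seconds, peak_bytes = profile(train_dummy, 100_000)
print(f"time={seconds:.4f}s peak_mem={peak_bytes} bytes")
```

Running the same harness over each framework's training entry point, with identical data and model configurations, is one way to obtain the comparable time and memory figures the study reports.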




Acknowledgement

We thank the Ministry of Higher Education and Gabes University for facilitating Salwa Ajel's travel to Portugal, HASLab/INESC TEC, Universidade do Minho (Portugal) for technical support of this work, and the Erasmus Jamies for accepting Salwa Ajel's application.

Author information


Corresponding author

Correspondence to Salwa Ajel.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ajel, S., Ribeiro, F., Ejbali, R., Saraiva, J. (2023). Energy Efficiency of Python Machine Learning Frameworks. In: Abraham, A., Pllana, S., Casalino, G., Ma, K., Bajaj, A. (eds) Intelligent Systems Design and Applications. ISDA 2022. Lecture Notes in Networks and Systems, vol 715. Springer, Cham. https://doi.org/10.1007/978-3-031-35507-3_57
