
A Preliminary Approach to Semi-supervised Learning in Convolutional Neural Networks Applying “Sleep-Wake” Cycles

  • Conference paper
Neural Information Processing (ICONIP 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10637)


Abstract

The scarcity of labeled data long limited the capacity of convolutional neural networks (CNNs) and still poses a serious problem in a number of image processing applications. Unsupervised methods have been shown to perform well in feature extraction and clustering tasks, but unsupervised solutions for CNNs require further investigation. In this work, we propose a bio-inspired methodology in which a deep generative model helps the CNN take advantage of unlabeled data and improve its classification performance. Inspired by the human “sleep-wake cycle”, the proposed method divides the learning process into sleep and waking periods. During the waking period, both the generative model and the CNN learn from real training data simultaneously. When sleep begins, neither network receives real data; instead, the generative model creates a synthetic dataset from which the CNN learns. The experimental results show that the generative model was able to teach the CNN and improve its classification performance.
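The alternating schedule described above can be sketched as follows. This is a minimal illustration of the control flow only, not the authors' implementation: the CNN classifier and the deep generative model are replaced by placeholder labels, and the function names and parameters (`sleep_wake_schedule`, `n_cycles`, `sleep_batches`) are hypothetical.

```python
# Minimal sketch of the "sleep-wake" training schedule from the abstract.
# Both models are stand-ins; only who learns from what, and when, is shown.

def sleep_wake_schedule(real_batches, n_cycles=2, sleep_batches=3):
    """Alternate wake phases (both models learn from real data) with
    sleep phases (the classifier learns only from generated samples)."""
    log = []
    for cycle in range(n_cycles):
        # Waking period: the generative model and the CNN both see real batches.
        for batch in real_batches:
            log.append(("wake", "generator+classifier", batch))
        # Sleep period: no real data reaches either network; the generative
        # model produces synthetic batches for the CNN to learn from.
        for i in range(sleep_batches):
            synthetic = "generated-%d-%d" % (cycle, i)
            log.append(("sleep", "classifier", synthetic))
    return log

log = sleep_wake_schedule(["batch0", "batch1"], n_cycles=2, sleep_batches=3)
print(sum(1 for e in log if e[0] == "wake"))   # 4 wake steps
print(sum(1 for e in log if e[0] == "sleep"))  # 6 sleep steps
```

In a real setting, the wake branch would run a gradient step on both networks using a labeled or unlabeled real batch, and the sleep branch would sample from the trained generative model before updating only the classifier.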




Acknowledgments

This work has been partially supported by the Spanish Ministry of Science and Technology under the project TIN2016-77356-P (AEI/FEDER, UE).

Author information


Corresponding author

Correspondence to Mikel Elkano.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Elkano, M., Bustince, H., Paplinski, A. (2017). A Preliminary Approach to Semi-supervised Learning in Convolutional Neural Networks Applying “Sleep-Wake” Cycles. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds.) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol. 10637. Springer, Cham. https://doi.org/10.1007/978-3-319-70093-9_49

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-70093-9_49

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-70092-2

  • Online ISBN: 978-3-319-70093-9

  • eBook Packages: Computer Science (R0)
