Abstract
Until recently, the scarcity of labeled data limited the capacity of convolutional neural networks (CNNs), and it still poses a serious problem in many image processing applications. Unsupervised methods have been shown to perform well in feature extraction and clustering tasks, but unsupervised solutions for CNNs require further investigation. In this work, we propose a bio-inspired methodology in which a deep generative model helps the CNN exploit unlabeled data and improve its classification performance. Inspired by the human "sleep-wake cycle", the proposed method divides learning into waking and sleep periods. During the waking period, both the generative model and the CNN learn from real training data simultaneously. During sleep, neither network receives real data; instead, the generative model creates a synthetic dataset from which the CNN learns. The experimental results show that the generative model was able to teach the CNN and improve its classification performance.
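The alternation described above can be illustrated with a short, hypothetical sketch. The code below is not the authors' implementation: it assumes a label-conditional generative model (for example a conditional VAE) exposing loss(x, y) and sample(y) methods, a standard classifier cnn, and a PyTorch data loader of labeled examples; all names and hyperparameters are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def train_wake_sleep(cnn, gen_model, loader, num_classes,
                         wake_epochs=1, sleep_batches=100, device="cpu"):
        # Hypothetical training loop; gen_model.loss() and gen_model.sample()
        # are assumed interfaces, not taken from the paper itself.
        opt_cnn = torch.optim.Adam(cnn.parameters(), lr=1e-3)
        opt_gen = torch.optim.Adam(gen_model.parameters(), lr=1e-3)

        # Waking period: both networks learn from real labeled data.
        cnn.train(); gen_model.train()
        for _ in range(wake_epochs):
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                loss_cnn = F.cross_entropy(cnn(x), y)   # supervised CNN update
                opt_cnn.zero_grad(); loss_cnn.backward(); opt_cnn.step()
                loss_gen = gen_model.loss(x, y)          # e.g. conditional-VAE ELBO
                opt_gen.zero_grad(); loss_gen.backward(); opt_gen.step()

        # Sleep period: no real data; the CNN learns only from synthetic samples.
        gen_model.eval()
        for _ in range(sleep_batches):
            y_syn = torch.randint(0, num_classes, (64,), device=device)
            with torch.no_grad():
                x_syn = gen_model.sample(y_syn)          # assumed sampling interface
            loss = F.cross_entropy(cnn(x_syn), y_syn)
            opt_cnn.zero_grad(); loss.backward(); opt_cnn.step()
        return cnn

The sketch shows a single wake/sleep cycle; in the full method these periods would presumably be repeated in alternation over training.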
Acknowledgments
This work has been partially supported by the Spanish Ministry of Science and Technology under the project TIN2016-77356-P (AEI/FEDER, UE).
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Elkano, M., Bustince, H., Paplinski, A. (2017). A Preliminary Approach to Semi-supervised Learning in Convolutional Neural Networks Applying "Sleep-Wake" Cycles. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds.) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol. 10637. Springer, Cham. https://doi.org/10.1007/978-3-319-70093-9_49
DOI: https://doi.org/10.1007/978-3-319-70093-9_49
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70092-2
Online ISBN: 978-3-319-70093-9