Abstract
In recent years, deep learning has made significant progress in computer vision, but most studies focus on algorithms in the real domain. Complex numbers offer richer representational capacity and can use fewer trainable parameters. This paper proposes a complex-valued densely connected convolutional network, complex-valued DenseNet, which generalizes the structure of real-valued DenseNets to the complex domain and constructs its basic components, including the complex dense block and the complex transition layer. Experiments were performed on the CIFAR-10 and CIFAR-100 datasets. The results show that the proposed network achieves a lower error rate with fewer parameters than real-valued DenseNets and complex-valued ResNets.
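The two ideas the abstract combines can be sketched concretely. A complex-valued layer keeps separate real and imaginary feature maps and combines them via the product rule (a + ib)(x + iy) = (ax - by) + i(ay + bx), while a dense block feeds every layer the channel-wise concatenation of all earlier outputs. The NumPy sketch below illustrates both with a 1x1 (channel-mixing) convolution; the function names, the use of a 1x1 kernel, and the per-component ReLU activation are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def complex_conv1x1(real_in, imag_in, w_real, w_imag):
    """Complex 1x1 convolution on (H, W, C_in) feature maps.

    For input z = a + ib and kernel w = x + iy:
        z * w = (ax - by) + i(ay + bx),
    so one complex product is built from four real products.
    """
    real_out = real_in @ w_real - imag_in @ w_imag
    imag_out = real_in @ w_imag + imag_in @ w_real
    return real_out, imag_out

def complex_dense_block(real_in, imag_in, weights):
    """Hypothetical complex dense block: each layer receives the
    channel-wise concatenation of the input and all earlier layer
    outputs, with real and imaginary parts concatenated separately.

    `weights` is a list of (w_real, w_imag) pairs; the kernel of
    layer l must have C0 + l * growth_rate input channels.
    """
    reals, imags = [real_in], [imag_in]
    for w_real, w_imag in weights:
        a = np.concatenate(reals, axis=-1)
        b = np.concatenate(imags, axis=-1)
        r, i = complex_conv1x1(a, b, w_real, w_imag)
        # Component-wise ReLU as a stand-in complex activation.
        reals.append(np.maximum(r, 0.0))
        imags.append(np.maximum(i, 0.0))
    return np.concatenate(reals, axis=-1), np.concatenate(imags, axis=-1)
```

With input channels C0 = 3 and growth rate 2, two layers produce 3 + 2 + 2 = 7 output channels per component, mirroring how real-valued DenseNets grow feature maps by concatenation rather than summation.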
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Li, W., Xie, W., Wang, Z. (2020). Complex-Valued Densely Connected Convolutional Networks. In: Zeng, J., Jing, W., Song, X., Lu, Z. (eds) Data Science. ICPCSEE 2020. Communications in Computer and Information Science, vol 1257. Springer, Singapore. https://doi.org/10.1007/978-981-15-7981-3_21
Print ISBN: 978-981-15-7980-6
Online ISBN: 978-981-15-7981-3