{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,2,15]],"date-time":"2024-02-15T11:59:32Z","timestamp":1707998372045},"reference-count":12,"publisher":"MIT Press - Journals","issue":"3","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Neural Computation"],"published-print":{"date-parts":[[2019,3]]},"abstract":" This letter deals with neural networks as dynamical systems governed by finite difference equations. It shows that the introduction of [Formula: see text]-many skip connections into network architectures, such as residual networks and additive dense networks, defines [Formula: see text]th order dynamical equations on the layer-wise transformations. Closed-form solutions for the state-space representations of general [Formula: see text]th order additive dense networks, where the concatenation operation is replaced by addition, as well as [Formula: see text]th order smooth networks, are found. The developed provision endows deep neural networks with an algebraic structure. Furthermore, it is shown that imposing [Formula: see text]th order smoothness on network architectures with [Formula: see text]-many nodes per layer increases the state-space dimension by a multiple of [Formula: see text], and so the effective embedding dimension of the data manifold by the neural network is [Formula: see text]-many dimensions. It follows that network architectures of these types reduce the number of parameters needed to maintain the same embedding dimension by a factor of [Formula: see text] when compared to an equivalent first-order, residual network. Numerical simulations and experiments on CIFAR10, SVHN, and MNIST have been conducted to help understand the developed theory and efficacy of the proposed concepts. 
<\/jats:p>","DOI":"10.1162\/neco_a_01165","type":"journal-article","created":{"date-parts":[[2019,1,15]],"date-time":"2019-01-15T18:18:06Z","timestamp":1547576286000},"page":"538-554","source":"Crossref","is-referenced-by-count":8,"title":["State-Space Representations of Deep Neural Networks"],"prefix":"10.1162","volume":"31","author":[{"given":"Michael","family":"Hauser","sequence":"first","affiliation":[{"name":"Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A."}]},{"given":"Sean","family":"Gunn","sequence":"additional","affiliation":[{"name":"Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A."}]},{"suffix":"Jr.","given":"Samer","family":"Saab","sequence":"additional","affiliation":[{"name":"Department of Electrical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A."}]},{"given":"Asok","family":"Ray","sequence":"additional","affiliation":[{"name":"Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A."}]}],"member":"281","reference":[{"key":"B2","author":"Chang B.","year":"2017","journal-title":"Reversible architectures for arbitrarily deep residual neural networks"},{"key":"B3","author":"Chang B.","year":"2017","journal-title":"Multi-level residual networks from dynamical systems view"},{"key":"B4","doi-asserted-by":"publisher","DOI":"10.1088\/1361-6420\/aa9a90"},{"key":"B5","first-page":"2804","volume-title":"Advances in neural information processing systems","author":"Hauser M.","year":"2017"},{"key":"B6","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2016.90"},{"key":"B7","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2017.243"},{"key":"B8","author":"Kingma D. P.","year":"2014","journal-title":"Adam: A method for stochastic optimization"},{"key":"B9","author":"Lin H.","year":"2018","journal-title":"Resnet with one-neuron hidden layers is a universal approximator"},{"key":"B10","author":"Lu Y.","year":"2017","journal-title":"Beyond finite layer neural networks: Bridging deep architectures and numerical differential equations"},{"key":"B11","author":"Proctinger H.","year":"1993","journal-title":"Some information about the binomial transform"},{"key":"B12","doi-asserted-by":"publisher","DOI":"10.21236\/ADA164453"},{"key":"B13","first-page":"550","volume-title":"Advances in neural information processing systems","author":"Veit A.","year":"2016"}],"container-title":["Neural Computation"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mitpressjournals.org\/doi\/pdf\/10.1162\/neco_a_01165","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2021,3,12]],"date-time":"2021-03-12T21:43:00Z","timestamp":1615585380000},"score":1,"resource":{"primary":{"URL":"https:\/\/direct.mit.edu\/neco\/article\/31\/3\/538-554\/8453"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,3]]},"references-count":12,"journal-issue":{"issue":"3","published-print":{"date-parts":[[2019,3]]}},"alternative-id":["10.1162\/neco_a_01165"],"URL":"https:\/\/doi.org\/10.1162\/neco_a_01165","relation":{},"ISSN":["0899-7667","1530-888X"],"issn-type":[{"value":"0899-7667","type":"print"},{"value":"1530-888X","type":"electronic"}],"subject":[],"published":{"date-parts":[[2019,3]]}}}