Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework
- PMID: 26928718
- PMCID: PMC4771709
- DOI: 10.1371/journal.pcbi.1004792
Abstract
The ability to simultaneously record from large numbers of neurons in behaving animals has ushered in a new era for the study of the neural circuit mechanisms underlying cognitive functions. One promising approach to uncovering the dynamical and computational principles governing population responses is to analyze model recurrent neural networks (RNNs) that have been optimized to perform the same tasks as behaving animals. Because the optimization of network parameters specifies the desired output but not the manner in which to achieve this output, "trained" networks serve as a source of mechanistic hypotheses and a testing ground for data analyses that link neural computation to behavior. Complete access to the activity and connectivity of the circuit, and the ability to manipulate them arbitrarily, make trained networks a convenient proxy for biological circuits and a valuable platform for theoretical investigation. However, existing RNNs lack basic biological features such as the distinction between excitatory and inhibitory units (Dale's principle), which are essential if RNNs are to provide insights into the operation of biological circuits. Moreover, trained networks can achieve the same behavioral performance but differ substantially in their structure and dynamics, highlighting the need for a simple and flexible framework for the exploratory training of RNNs. Here, we describe a framework for gradient descent-based training of excitatory-inhibitory RNNs that can incorporate a variety of biological knowledge. We provide an implementation based on the machine learning library Theano, whose automatic differentiation capabilities facilitate modifications and extensions. We validate this framework by applying it to well-known experimental paradigms such as perceptual decision-making, context-dependent integration, multisensory integration, parametric working memory, and motor sequence generation. Our results demonstrate the wide range of neural activity patterns and behavior that can be modeled, and suggest a unified setting in which diverse cognitive computations and mechanisms can be studied.
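The key biological constraint highlighted in the abstract, Dale's principle, is typically enforced by parameterizing the recurrent weight matrix as a rectified (non-negative) magnitude matrix multiplied by a fixed diagonal sign matrix, so each unit's outgoing connections are all excitatory or all inhibitory. The sketch below illustrates this parameterization in plain NumPy rather than the paper's Theano implementation; the 80/20 excitatory-inhibitory split and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, N_exc = 100, 80                  # illustrative: 80% excitatory, 20% inhibitory units
signs = np.ones(N)
signs[N_exc:] = -1.0                # inhibitory units project with negative sign
D = np.diag(signs)                  # fixed diagonal sign matrix

# Free parameters (these would be updated by gradient descent during training).
W_raw = rng.normal(scale=0.1, size=(N, N))

# Rectification keeps magnitudes non-negative, so all signs come from D:
# column j of W_rec (outgoing weights of presynaptic unit j) has a fixed sign.
W_rec = np.maximum(W_raw, 0.0) @ D

assert (W_rec[:, :N_exc] >= 0).all()   # excitatory columns non-negative
assert (W_rec[:, N_exc:] <= 0).all()   # inhibitory columns non-positive
```

Because the rectification and sign matrix are differentiable (almost everywhere) functions of the free parameters, the constraint is compatible with the automatic differentiation workflow the abstract describes.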
Conflict of interest statement
The authors have declared that no competing interests exist.
Similar articles
- Considerations in using recurrent neural networks to probe neural dynamics. J Neurophysiol. 2019 Dec 1;122(6):2504-2521. doi: 10.1152/jn.00467.2018. PMID: 31619125
- PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks. eNeuro. 2021 Jan 15;8(1):ENEURO.0427-20.2020. doi: 10.1523/ENEURO.0427-20.2020. PMID: 33328247
- Training dynamically balanced excitatory-inhibitory networks. PLoS One. 2019 Aug 8;14(8):e0220547. doi: 10.1371/journal.pone.0220547. PMID: 31393909
- Recurrent neural networks as versatile tools of neuroscience research. Curr Opin Neurobiol. 2017 Oct;46:1-6. doi: 10.1016/j.conb.2017.06.003. PMID: 28668365
- Performance of a Computational Model of the Mammalian Olfactory System. In: Persaud KC, Marco S, Gutiérrez-Gálvez A, editors. Neuromorphic Olfaction. Boca Raton (FL): CRC Press/Taylor & Francis; 2013. Chapter 6. PMID: 26042330
Cited by
- Two views on the cognitive brain. Nat Rev Neurosci. 2021 Jun;22(6):359-371. doi: 10.1038/s41583-021-00448-6. PMID: 33859408
- Computational modeling of human multisensory spatial representation by a neural architecture. PLoS One. 2023 Mar 8;18(3):e0280987. doi: 10.1371/journal.pone.0280987. PMID: 36888612
- Computational mechanisms of distributed value representations and mixed learning strategies. Nat Commun. 2021 Dec 10;12(1):7191. doi: 10.1038/s41467-021-27413-2. PMID: 34893597
- Understanding the computation of time using neural network models. Proc Natl Acad Sci U S A. 2020 May 12;117(19):10530-10540. doi: 10.1073/pnas.1921609117. PMID: 32341153
- Multi-context blind source separation by error-gated Hebbian rule. Sci Rep. 2019 May 9;9(1):7127. doi: 10.1038/s41598-019-43423-z. PMID: 31073206