Triple-Memory Networks: A Brain-Inspired Method for Continual Learning
- PMID: 34529579
- DOI: 10.1109/TNNLS.2021.3111019
Abstract
Continual acquisition of novel experience without interfering with previously learned knowledge, i.e., continual learning, is critical for artificial neural networks but is limited by catastrophic forgetting: a neural network adjusts its parameters when learning a new task and then fails to perform the old tasks well. By contrast, the biological brain effectively addresses catastrophic forgetting by consolidating memories as more specific or more generalized forms that complement each other, a process carried out in the interplay of the hippocampus and neocortex and mediated by the prefrontal cortex. Inspired by this brain strategy, we propose a novel approach named triple-memory networks (TMNs) for continual learning. TMNs model the interplay of the three brain regions as a triple-network architecture of generative adversarial networks (GANs). The input information is encoded as specific representations of the data distributions in a generator, or as generalized knowledge for solving tasks in a discriminator and a classifier, with appropriate brain-inspired algorithms implemented to alleviate catastrophic forgetting in each module. In particular, the generator replays generated data of the learned tasks to the discriminator and the classifier, both of which are trained with a weight consolidation regularizer to compensate for the information lost in the generation process. TMNs achieve state-of-the-art performance among generative memory replay approaches on a variety of class-incremental learning benchmarks on MNIST, SVHN, CIFAR-10, and ImageNet-50.
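The abstract combines two mechanisms: generative replay of earlier tasks and an importance-weighted (EWC-style) consolidation penalty on the task-solving modules. The sketch below is a minimal illustration of that combination, assuming PyTorch; the module and function names (consolidation_penalty, classifier_step, generator.latent_dim) are hypothetical and do not reproduce the authors' released implementation.

```python
# Minimal sketch (assumption: PyTorch; names are illustrative, not the TMN code).
# (1) The generator replays samples of previously learned classes to the classifier.
# (2) A quadratic weight-consolidation penalty protects parameters estimated to be
#     important for old tasks, compensating for information lost in generation.
import torch
import torch.nn.functional as F

def consolidation_penalty(model, fisher, old_params, lam=1.0):
    """Quadratic penalty weighting each parameter by its estimated importance."""
    loss = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return lam * loss

def classifier_step(classifier, optimizer, x_new, y_new,
                    generator, old_labels, fisher, old_params,
                    replay_batch=64, lam=1.0):
    """One training step on a new task, mixing real data with generated replay."""
    # Replay: sample images of previously learned classes from the (conditional) generator.
    z = torch.randn(replay_batch, generator.latent_dim)  # latent_dim is an assumed attribute
    y_old = old_labels[torch.randint(len(old_labels), (replay_batch,))]
    with torch.no_grad():
        x_old = generator(z, y_old)

    x = torch.cat([x_new, x_old])
    y = torch.cat([y_new, y_old])

    logits = classifier(x)
    loss = F.cross_entropy(logits, y)
    # Weight consolidation regularizer on the classifier parameters.
    loss = loss + consolidation_penalty(classifier, fisher, old_params, lam)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, `fisher` and `old_params` would be snapshots taken after finishing the previous task, and the same replay-plus-regularizer step would apply to the discriminator during GAN training.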
Similar articles
- Brain-inspired replay for continual learning with artificial neural networks. Nat Commun. 2020 Aug 13;11(1):4069. doi: 10.1038/s41467-020-17866-2. PMID: 32792531. Free PMC article.
- Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. doi: 10.1109/TNNLS.2021.3099700. Epub 2022 May 2. PMID: 34339377.
- GopGAN: Gradients Orthogonal Projection Generative Adversarial Network With Continual Learning. IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):215-227. doi: 10.1109/TNNLS.2021.3093319. Epub 2023 Jan 5. PMID: 34270433.
- Contributions by metaplasticity to solving the Catastrophic Forgetting Problem. Trends Neurosci. 2022 Sep;45(9):656-666. doi: 10.1016/j.tins.2022.06.002. Epub 2022 Jul 4. PMID: 35798611. Review.
- Continual lifelong learning with neural networks: A review. Neural Netw. 2019 May;113:54-71. doi: 10.1016/j.neunet.2019.01.012. Epub 2019 Feb 6. PMID: 30780045. Review.
Cited by
- Coffee With a Hint of Data: Towards Using Data-Driven Approaches in Personalised Long-Term Interactions. Front Robot AI. 2021 Sep 28;8:676814. doi: 10.3389/frobt.2021.676814. eCollection 2021. PMID: 34651017. Free PMC article.
- Distinctive properties of biological neural networks and recent advances in bottom-up approaches toward a better biologically plausible neural network. Front Comput Neurosci. 2023 Jun 28;17:1092185. doi: 10.3389/fncom.2023.1092185. eCollection 2023. PMID: 37449083. Free PMC article. Review.