Embracing Change: Continual Learning in Deep Neural Networks
- PMID: 33158755
- DOI: 10.1016/j.tics.2020.09.004
Abstract
Artificial intelligence research has seen enormous progress over the past few decades, but it predominantly relies on fixed datasets and stationary environments. Continual learning is an increasingly relevant area of study that asks how artificial systems might learn sequentially, as biological systems do, from a continuous stream of correlated data. In the present review, we relate continual learning to the learning dynamics of neural networks, highlighting the potential it has to considerably improve data efficiency. We further consider the many new biologically inspired approaches that have emerged in recent years, focusing on those that utilize regularization, modularity, memory, and meta-learning, and highlight some of the most promising and impactful directions.
Keywords: artificial intelligence; lifelong; memory; meta-learning; non-stationary.
Copyright © 2020 The Authors. Published by Elsevier Ltd. All rights reserved.
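The abstract mentions regularization-based approaches as one family of continual-learning methods, alongside modularity, memory, and meta-learning. As an illustration only, and not the method proposed in this review, the sketch below shows an elastic-weight-consolidation-style quadratic penalty in PyTorch: the function names, the `lam` coefficient, and the use of a cross-entropy loss are assumptions made for the example.

```python
# Minimal sketch of a regularization-based continual-learning penalty (EWC-style).
# Assumes PyTorch; names and hyperparameters here are illustrative, not from the review.
import torch
import torch.nn.functional as F

def fisher_diagonal(model, data_loader):
    """Estimate the diagonal Fisher information from squared gradients on a task's data."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in data_loader:
        model.zero_grad()
        F.cross_entropy(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    n_batches = max(len(data_loader), 1)
    return {n: f / n_batches for n, f in fisher.items()}

def ewc_penalty(model, fisher, anchor_params, lam=100.0):
    """Quadratic penalty pulling parameters toward their values after the previous task,
    weighted by the estimated Fisher importance of each parameter."""
    penalty = torch.zeros(())
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - anchor_params[n]) ** 2).sum()
    return 0.5 * lam * penalty

# Hypothetical usage after training on task A, while training on task B:
#   fisher = fisher_diagonal(model, task_a_loader)
#   anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
#   loss = F.cross_entropy(model(x_b), y_b) + ewc_penalty(model, fisher, anchor)
```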
Similar articles
- Schematic memory persistence and transience for efficient and robust continual learning. Neural Netw. 2021;144:49-60. doi: 10.1016/j.neunet.2021.08.011. PMID: 34450446.
- Continual lifelong learning with neural networks: A review. Neural Netw. 2019;113:54-71. doi: 10.1016/j.neunet.2019.01.012. PMID: 30780045.
- Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. IEEE Trans Neural Netw Learn Syst. 2022;33(5):2010-2022. doi: 10.1109/TNNLS.2021.3099700. PMID: 34339377.
- Emergence and reconfiguration of modular structure for artificial neural networks during continual familiarity detection. Sci Adv. 2024;10(30):eadm8430. doi: 10.1126/sciadv.adm8430. PMID: 39058783.
- Efficient, continual, and generalized learning in the brain - neural mechanism of Mental Schema 2.0. Rev Neurosci. 2023;34(8):839-868. doi: 10.1515/revneuro-2022-0137. PMID: 36960579.
Cited by
- Reconciling shared versus context-specific information in a neural network model of latent causes. Sci Rep. 2024;14(1):16782. doi: 10.1038/s41598-024-64272-5. PMID: 39039131.
- Pareto optimality, economy-effectiveness trade-offs and ion channel degeneracy: improving population modelling for single neurons. Open Biol. 2022;12(7):220073. doi: 10.1098/rsob.220073. PMID: 35857898.
- A survey and perspective on neuromorphic continual learning systems. Front Neurosci. 2023;17:1149410. doi: 10.3389/fnins.2023.1149410. PMID: 37214407.
- Robotics Dexterous Grasping: The Methods Based on Point Cloud and Deep Learning. Front Neurorobot. 2021;15:658280. doi: 10.3389/fnbot.2021.658280. PMID: 34177509.
- The challenges of lifelong learning in biological and artificial systems. Trends Cogn Sci. 2022;26(12):1051-1053. doi: 10.1016/j.tics.2022.09.022. PMID: 36335012.