Review
Trends Cogn Sci. 2020 Dec;24(12):1028-1040. doi: 10.1016/j.tics.2020.09.004. Epub 2020 Nov 3.

Embracing Change: Continual Learning in Deep Neural Networks


Raia Hadsell et al. Trends Cogn Sci. 2020 Dec.
Free article

Abstract

Artificial intelligence research has seen enormous progress over the past few decades, but it predominantly relies on fixed datasets and stationary environments. Continual learning is an increasingly relevant area of study that asks how artificial systems might learn sequentially, as biological systems do, from a continuous stream of correlated data. In the present review, we relate continual learning to the learning dynamics of neural networks, highlighting the potential it has to considerably improve data efficiency. We further consider the many new biologically inspired approaches that have emerged in recent years, focusing on those that utilize regularization, modularity, memory, and meta-learning, and highlight some of the most promising and impactful directions.

Keywords: artificial intelligence; lifelong; memory; meta-learning; non-stationary.
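As a concrete illustration of the regularization family of approaches mentioned in the abstract (a sketch, not a method taken from this review): elastic weight consolidation (EWC) mitigates forgetting by penalizing parameter drift away from values learned on earlier tasks, weighted by a per-parameter importance estimate. The helper name `ewc_penalty` and the toy numbers below are hypothetical.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC-style quadratic penalty for continual learning.

    Anchors the current parameters `theta` to the parameters
    `theta_star` learned on a previous task, with each parameter
    weighted by `fisher`, an importance estimate (e.g. the diagonal
    of the Fisher information matrix). `lam` trades off retaining
    the old task against fitting the new one.
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Toy example: two parameters, the first far more "important" than
# the second, so drift in it is penalized far more heavily.
theta_star = np.array([1.0, -2.0])   # weights after task A
fisher     = np.array([10.0, 0.1])   # importance estimates
theta      = np.array([1.5, -1.0])   # weights drifting during task B

print(ewc_penalty(theta, theta_star, fisher, lam=1.0))
```

During training on a new task, this penalty is simply added to the new task's loss, so important weights stay near their old values while unimportant ones remain free to change.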

