Authors:
Asma Kharrat 1; Fadoua Drira 1; Franck Lebourgeois 2 and Bertrand Kerautret 3
Affiliations:
1 ReGIM-Lab, University of Sfax, ENIS, BP1173, 3038, Sfax, Tunisia
2 LIRIS, University of Lyon, INSA-Lyon, CNRS, UMR5205, F-69621, Lyon, France
3 LIRIS, University of Lyon, Université Lumière Lyon2, F-69365, Lyon, France
Keyword(s):
Deep Learning, Continual Learning, Natural Language Processing, Catastrophic Forgetting.
Abstract:
Deep learning-based Natural Language Processing (NLP) has advanced significantly over the past decades, driven by the remarkable performance of statically trained models across a range of text datasets. However, this paradigm relies heavily on static settings and predefined datasets, making it ill-suited to ongoing data streams, where new training tends to erase previously acquired knowledge. Continual learning (CL) offers a more adaptable framework: it aims to enable machine learning models to learn from a continuous data stream while preserving what they have already learned. In the context of NLP, continual learning presents unique challenges and opportunities due to the dynamic and diverse nature of language data. In this paper, we provide a thorough analysis of recent advances in CL across NLP tasks and illustrate the major challenges involved. We also critically review existing CL evaluation protocols and benchmarks in NLP. Finally, we present open issues that we consider in need of further investigation and our outlook on future research directions.