2025
1.
Yang, Qinwen; Wang, Liyuan; Wicker, Jörg; Dobbie, Gillian
Continual Learning: A Systematic Literature Review (Journal Article)
In: Neural Networks, pp. 108226, 2025, ISSN: 0893-6080.
Abstract | Links | BibTeX | Altmetric | PlumX | Tags: catastrophic forgetting, continual learning, incremental learning, lifelong learning, machine learning, systematic literature review
@article{Yang2025continual,
title = {Continual Learning: A Systematic Literature Review},
author = {Qinwen Yang and Liyuan Wang and J\"{o}rg Wicker and Gillian Dobbie},
url = {https://www.sciencedirect.com/science/article/pii/S0893608025011074},
doi = {10.1016/j.neunet.2025.108226},
issn = {0893-6080},
year = {2025},
date = {2025-10-20},
journal = {Neural Networks},
pages = {108226},
abstract = {In today’s data-driven landscape, intelligent systems must continuously learn from evolving data streams while preserving previously acquired knowledge. Continual learning (CL) has emerged as a key paradigm to meet this need, offering the potential to build adaptable and resilient AI systems. Despite its promise, CL faces core challenges such as catastrophic forgetting (CF), which occurs when learning new tasks degrades prior knowledge. These challenges have spurred rapid growth in CL research and method development. This paper presents a systematic literature review (SLR) of CL research published from 2010 to 2025. We adopt a reproducible methodology that combines automated searches across five major databases with manual searches from 13 key venues, resulting in a curated set of 189 peer-reviewed papers. Our review fills key gaps in earlier surveys by unifying fragmented classifications, probing overlooked causes of catastrophic forgetting, and advancing a more principled theoretical framing of CL challenges beyond empirical benchmarking. We contribute by offering (1) an updated, comprehensive overview of CL research; (2) refined taxonomies of learning methods and settings; (3) an analysis of the structural and functional causes of CF; and (4) the identification of open theoretical questions and emerging research directions. We provide an open-source repository of the reviewed literature and associated resources to support ongoing work: https://github.com/yyy24601/Continual-Learning-Paper.},
keywords = {catastrophic forgetting, continual learning, incremental learning, lifelong learning, machine learning, systematic literature review},
pubstate = {published},
tppubtype = {article}
}
