TY - JOUR
T1 - Learning to Continually Learn Rapidly from Few and Noisy Data
AU - Kuo, Nicholas I-Hsien
AU - Harandi, Mehrtash
AU - Fourrier, Nicolas
AU - Walder, Christian
AU - Ferraro, Gabriela
AU - Suominen, Hanna
N1 - Publisher Copyright:
Copyright © The authors and PMLR 2023.
PY - 2021
Y1 - 2021
N2 - Neural networks suffer from catastrophic forgetting and are unable to sequentially learn new tasks without guaranteed stationarity in data distribution. Continual learning could be achieved via replay, that is, by concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented replay mechanics with meta-learning for rapid knowledge acquisition. By employing a meta-learner, which learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when continually learning in the presence of noise and brought base learners to higher accuracy in fewer updates.
AB - Neural networks suffer from catastrophic forgetting and are unable to sequentially learn new tasks without guaranteed stationarity in data distribution. Continual learning could be achieved via replay, that is, by concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented replay mechanics with meta-learning for rapid knowledge acquisition. By employing a meta-learner, which learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when continually learning in the presence of noise and brought base learners to higher accuracy in fewer updates.
UR - http://www.scopus.com/inward/record.url?scp=85171442685&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85171442685
SN - 2640-3498
VL - 140
SP - 65
EP - 76
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 2021 AAAI Workshop on Meta-Learning and MetaDL Challenge
Y2 - 9 February 2021
ER -