Abstract
Neural networks suffer from catastrophic forgetting and are unable to sequentially learn new tasks without guaranteed stationarity in the data distribution. Continual learning could be achieved via replay: concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented the replay mechanism with meta-learning for rapid knowledge acquisition. By employing a meta-learner that learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when learning continually in the presence of noise, and it led base learners to higher accuracy in fewer updates.
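The abstract describes a meta-learner that learns one learning rate per parameter per past task and trains it jointly with replay. Below is a minimal PyTorch-style sketch of that idea, not the authors' implementation: all names (`BaseLearner`, `log_lrs`, the toy replay buffers) are hypothetical, and the inner/outer loop structure is one plausible reading of the abstract.

```python
# Hypothetical sketch: replay plus a meta-learned, per-parameter,
# per-task learning rate. Not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

class BaseLearner(nn.Module):
    """A small classifier standing in for the base learner."""
    def __init__(self, in_dim=32, n_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, 64)
        self.fc2 = nn.Linear(64, n_classes)

    def forward(self, x, params=None):
        # Functional forward so we can evaluate with "fast" weights
        # produced by differentiable inner-loop updates.
        if params is None:
            params = dict(self.named_parameters())
        h = F.relu(F.linear(x, params["fc1.weight"], params["fc1.bias"]))
        return F.linear(h, params["fc2.weight"], params["fc2.bias"])

model = BaseLearner()
n_tasks = 3

# One learnable learning rate per parameter tensor per past task,
# stored in log-space so the step sizes stay positive.
log_lrs = nn.ParameterDict({
    f"task{t}_{name.replace('.', '_')}":
        nn.Parameter(torch.full_like(p, -4.6))  # exp(-4.6) ~ 1e-2
    for t in range(n_tasks) for name, p in model.named_parameters()
})

meta_opt = torch.optim.Adam(
    list(model.parameters()) + list(log_lrs.parameters()), lr=1e-3)

# Toy replay buffers: a few stored (x, y) examples per past task.
replay = [(torch.randn(8, 32), torch.randint(0, 10, (8,)))
          for _ in range(n_tasks)]
x_new, y_new = torch.randn(8, 32), torch.randint(0, 10, (8,))

for step in range(100):
    params = {n: p for n, p in model.named_parameters()}
    # Inner loop: one differentiable SGD step per replayed task, each
    # using its own learned per-parameter step sizes.
    for t, (x_old, y_old) in enumerate(replay):
        loss = F.cross_entropy(model(x_old, params), y_old)
        grads = torch.autograd.grad(loss, list(params.values()),
                                    create_graph=True)
        params = {
            n: p - torch.exp(log_lrs[f"task{t}_{n.replace('.', '_')}"]) * g
            for (n, p), g in zip(params.items(), grads)
        }
    # Outer loss on the new task; backprop reaches both the base
    # weights and the per-task learning rates through the inner steps.
    meta_loss = F.cross_entropy(model(x_new, params), y_new)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```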
| Original language | English |
|---|---|
| Pages (from-to) | 65-76 |
| Number of pages | 12 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 140 |
| Publication status | Published - 2021 |
| Event | 2021 AAAI Workshop on Meta-Learning and MetaDL Challenge, Virtual, Online. Duration: 9 Feb 2021 → … |