Recoverable Memory Bank For Class-Incremental Learning
Kong, Jiangtao
Abstract
Incremental learning aims to enable machine learning systems to learn new tasks sequentially without forgetting old ones. While some existing methods, such as data replay-based and parameter isolation-based approaches, achieve remarkable results in incremental learning, they often suffer from memory constraints, privacy issues, or generation instability. To address these problems, we propose the Recoverable Memory Bank (RMB), a novel non-exemplar-based approach for class-incremental learning (CIL). Specifically, we design a dynamic memory bank that stores only one aggregated memory representing each class of the old tasks. Next, we propose a novel method that combines a high-dimensional rotation matrix with a Gaussian distribution to maximally recover the features of previous tasks from the aggregated memories in the dynamic memory bank. We then use the recovered features, together with the features of the current task, to optimize the backbone and classifier. To balance the model's performance between the current task and previous ones, we also adopt meta-learning to update the model, which promotes its ability to learn the current task effectively. Evaluation results on multiple benchmark datasets demonstrate that the proposed RMB significantly outperforms existing non-exemplar-based methods in terms of accuracy. Importantly, it can achieve performance comparable to some exemplar-based methods.
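The recovery step described above can be sketched as follows. This is a minimal NumPy illustration under loud assumptions, not the paper's implementation: it assumes the stored "aggregated memory" is a class-mean feature vector, that recovery draws Gaussian perturbations around that mean, and that the high-dimensional rotation is an orthogonal matrix (built here via QR decomposition, one common construction). The names `random_rotation` and `recover_features`, and the scalar `cov_scale`, are hypothetical.

```python
import numpy as np

def random_rotation(d, rng=None):
    """One illustrative way to build a d-dimensional orthogonal (rotation)
    matrix: QR-decompose a Gaussian matrix and fix column signs.
    This is an assumption, not the paper's construction."""
    rng = np.random.default_rng(rng)
    q, r = np.linalg.qr(rng.normal(size=(d, d)))
    # Sign correction makes the factorization unique (Haar-distributed Q).
    return q * np.sign(np.diag(r))

def recover_features(class_mean, rotation, cov_scale=0.1, n_samples=64, rng=None):
    """Sketch of feature recovery: perturb the stored per-class aggregated
    memory (assumed here to be a mean feature vector) with isotropic
    Gaussian noise, then rotate the samples in feature space."""
    rng = np.random.default_rng(rng)
    d = class_mean.shape[0]
    noise = rng.normal(scale=cov_scale, size=(n_samples, d))
    # Each row is one recovered pseudo-feature for this old class.
    return (class_mean + noise) @ rotation.T
```

In a training loop, the recovered pseudo-features for each old class would be mixed with real features of the current task when optimizing the backbone and classifier; how the rotation and covariance are chosen per class is specific to the paper and not shown here.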
Date
2023-01-01
Department
Computer Science
DOI
https://dx.doi.org/10.21220/s2-0xn6-pt70
