Calibration for non-exemplar based class-incremental learning
Fei Zhu; Xu-Yao Zhang; Cheng-Lin Liu
2021
Conference date: July 5-9, 2021
Conference venue: Shenzhen, China
Abstract

Catastrophic forgetting is the central challenge in incremental learning. Notable studies address the problem with regularization or experience-replay strategies. However, performance remains far from ideal when previous data cannot be stored, especially in the scenario of class-incremental learning (CIL). In the CIL setting, an important cause of catastrophic forgetting is the severe bias between the new and previously learned classes, in both the classifier and the feature extractor. In this paper, we propose calibrateCIL, which contains two simple modifications that calibrate this bias in non-exemplar based CIL. Specifically, a local softmax is proposed to calibrate the classifier, and cutout training is used to calibrate the feature extractor by learning richer, more generalizable and transferable features. Our method produces balanced class scores without any post-processing technique. We show that it outperforms state-of-the-art non-exemplar based methods on the challenging problem of CIL, and an ablation study demonstrates the effectiveness of the two modifications.
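The two ingredients named in the abstract can be illustrated with a minimal sketch. Cutout (DeVries & Taylor, 2017) is a standard augmentation that masks out a random square patch of the input; the "local softmax" shown here is a hypothetical reading of the abstract, in which the softmax is normalized only over the classes of the current task so that training on new classes does not suppress old-class logits. The exact formulation in the paper may differ; function names and the `task_class_range` parameter are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cutout(image, mask_size=8):
    """Cutout augmentation (DeVries & Taylor, 2017): zero out a random
    square patch so the network must rely on broader spatial context,
    encouraging richer and more transferable features."""
    h, w = image.shape[:2]
    cy, cx = int(rng.integers(0, h)), int(rng.integers(0, w))
    y1, y2 = max(0, cy - mask_size // 2), min(h, cy + mask_size // 2)
    x1, x2 = max(0, cx - mask_size // 2), min(w, cx + mask_size // 2)
    out = image.copy()
    out[y1:y2, x1:x2] = 0.0  # masked region set to zero
    return out

def local_softmax_loss(logits, label, task_class_range):
    """Hypothetical 'local softmax' cross-entropy: normalize only over the
    logits of the current task's classes (task_class_range = (lo, hi)),
    so old-class logits receive no gradient pressure from new-task data."""
    lo, hi = task_class_range
    local = logits[lo:hi]
    local = local - local.max()                      # numerical stability
    log_probs = local - np.log(np.exp(local).sum())  # log-softmax over task
    return -log_probs[label - lo]
```

With uniform logits over a 5-class task, the loss reduces to log(5), exactly as a softmax restricted to those 5 classes would give; old-class positions outside `task_class_range` are untouched.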

Content type: Conference paper
Source URL: [http://ir.ia.ac.cn/handle/173211/47478]
Collection: Institute of Automation — National Laboratory of Pattern Recognition — Pattern Analysis and Learning Group
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation
GB/T 7714
Fei Zhu, Xu-Yao Zhang, Cheng-Lin Liu. Calibration for non-exemplar based class-incremental learning[C]. Shenzhen, China, July 5-9, 2021.
 
