Balanced knowledge distillation for long-tailed learning
Zhang, Shaoyu (1,2); Chen, Chen (1,2); Hu, Xiyuan (3); Peng, Silong (1,2,4)
Journal: NEUROCOMPUTING
Publication Date: 2023-03-28
Volume: 527, Pages: 36-46
Keywords: Long-tailed learning; Knowledge distillation; Vision and text classification
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2023.01.063
Corresponding Author: Chen, Chen (chen.chen@ia.ac.cn)
Abstract: Deep models trained on long-tailed datasets exhibit unsatisfactory performance on tail classes. Existing methods usually modify the classification loss to increase the learning focus on tail classes, which unexpectedly sacrifices the performance on head classes. In fact, this scheme leads to a contradiction between the two goals of long-tailed learning, i.e., learning generalizable representations and facilitating learning for tail classes. In this work, we explore knowledge distillation in long-tailed scenarios and propose a novel distillation framework, named Balanced Knowledge Distillation (BKD), to disentangle the contradiction between the two goals and achieve both simultaneously. Specifically, given a teacher model, we train the student model by minimizing the combination of an instance-balanced classification loss and a class-balanced distillation loss. The former benefits from the sample diversity and learns generalizable representations, while the latter considers the class priors and facilitates learning for tail classes. We conduct extensive experiments on several long-tailed benchmark datasets and demonstrate that the proposed BKD is an effective knowledge distillation framework in long-tailed scenarios, as well as a competitive method for long-tailed learning. Our source code is available at https://github.com/EricZsy/BalancedKnowledgeDistillation. © 2023 Elsevier B.V. All rights reserved.
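The objective described in the abstract can be made concrete with a short sketch. The PyTorch code below combines an instance-balanced cross-entropy term with a distillation term whose per-sample KL divergence is reweighted according to the class priors. This is only an illustrative approximation under assumed details: the inverse-frequency weighting, the function name bkd_loss, and the parameters class_counts, temperature, and alpha are not taken from the paper; the authors' exact formulation is in the linked repository.

# Minimal sketch of a BKD-style objective (illustrative, not the authors' code).
import torch
import torch.nn.functional as F

def bkd_loss(student_logits, teacher_logits, targets, class_counts,
             temperature=2.0, alpha=1.0):
    """Instance-balanced cross-entropy + class-balanced distillation (sketch)."""
    # Instance-balanced classification loss: plain cross-entropy, so every
    # sample contributes equally and representation learning benefits from
    # the full sample diversity.
    ce = F.cross_entropy(student_logits, targets)

    # Class-balanced distillation loss: soften both models' outputs and weight
    # each sample's KL term by the inverse frequency of its ground-truth class,
    # so tail classes receive a proportionally larger distillation signal.
    t = temperature
    log_p_student = F.log_softmax(student_logits / t, dim=1)
    p_teacher = F.softmax(teacher_logits / t, dim=1)
    kl_per_sample = F.kl_div(log_p_student, p_teacher, reduction='none').sum(dim=1)

    weights = 1.0 / class_counts.float()                      # inverse class frequency
    weights = weights / weights.sum() * len(class_counts)     # normalize to mean 1
    kd = (weights[targets] * kl_per_sample).mean() * (t * t)  # standard T^2 scaling

    return ce + alpha * kd

In use, class_counts would hold the per-class sample counts of the long-tailed training set, targets the ground-truth labels of the batch, and teacher_logits the outputs of a pre-trained teacher model on the same batch.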
Funding: National Key R&D Program of China [2021YFF0602101]; National Science Foundation of China [NSFC 61906194]
WOS Keywords: SMOTE
WOS Research Area: Computer Science
Language: English
Publisher: ELSEVIER
WOS Record Number: WOS:001054164700001
Funding Organization: National Key R&D Program of China; National Science Foundation of China
Content Type: Journal Article
Source URL: http://ir.ia.ac.cn/handle/173211/54140
Collection: Institute of Automation, Research Center for Intelligent Manufacturing Technology and Systems, Multi-dimensional Data Analysis Team
Author Affiliations:
1. Univ Chinese Acad Sci, Beijing, Peoples R China
2. Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
3. Nanjing Univ Sci & Technol, Nanjing, Peoples R China
4. Beijing ViSystem Co Ltd, Beijing, Peoples R China
Recommended Citation
GB/T 7714: Zhang, Shaoyu, Chen, Chen, Hu, Xiyuan, et al. Balanced knowledge distillation for long-tailed learning[J]. NEUROCOMPUTING, 2023, 527: 36-46.
APA: Zhang, Shaoyu, Chen, Chen, Hu, Xiyuan, & Peng, Silong. (2023). Balanced knowledge distillation for long-tailed learning. NEUROCOMPUTING, 527, 36-46.
MLA: Zhang, Shaoyu, et al. "Balanced knowledge distillation for long-tailed learning". NEUROCOMPUTING 527 (2023): 36-46.