Rethinking the PID optimizer for stochastic optimization of deep networks
Shi, Lei2,4; Zhang, Yifan2,4; Wang, Wanguo3; Cheng, Jian1,2,4; Lu, Hanqing2,4
2020
Conference date: July 6, 2020 - July 10, 2020
Conference location: London, United Kingdom
Abstract

Stochastic gradient descent with momentum (SGD-Momentum) suffers from the overshoot problem due to the integral action of the momentum term. Recently, an ID optimizer was proposed to solve the overshoot problem with the help of derivative information. However, the derivative term is sensitive to high-frequency noise, especially in stochastic gradient descent methods that use minibatch data at each update step. In this work, we propose a complete PID optimizer, which weakens the effect of the D term and adds a P term to alleviate the overshoot problem more stably. To further reduce the interference of high-frequency noise, two effective and efficient methods are proposed to stabilize the training process. Extensive experiments on three widely used benchmark datasets of different scales, i.e., MNIST, CIFAR-10, and Tiny ImageNet, demonstrate the superiority of our proposed PID optimizer on various popular deep neural networks.
© 2020 IEEE.
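The abstract above frames gradient updates in proportional-integral-derivative terms: the raw gradient acts as the proportional (P) term, the momentum buffer as the integral (I) term, and the change in successive gradients as the derivative (D) term. The sketch below illustrates this general PID-style update rule; the function name `pid_step`, the gain values `kp`, `ki`, `kd`, and the exact form of each term are illustrative assumptions, not the paper's formulation or hyperparameters.

```python
import numpy as np

def pid_step(theta, grad, state, lr=0.001, kp=1.0, ki=5.0, kd=0.1, momentum=0.9):
    """One PID-style parameter update (illustrative sketch only).

    P term: the current gradient.
    I term: an exponentially accumulated momentum buffer (integral of past gradients).
    D term: the change in gradient since the previous step (sensitive to minibatch noise).
    """
    v = momentum * state.get("v", np.zeros_like(theta)) + grad        # integral (momentum) buffer
    d = grad - state.get("prev_grad", np.zeros_like(theta))           # derivative estimate
    update = kp * grad + ki * v + kd * d                              # combine P, I, and D terms
    state["v"], state["prev_grad"] = v, grad
    return theta - lr * update, state
```

For example, minimizing f(x) = x² by repeatedly calling `pid_step` with `grad = 2 * theta` drives `theta` toward zero; the D term dampens the oscillation that a pure momentum (I-only) update would produce.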

Proceedings publisher: IEEE Computer Society
Language: English
Content type: Conference paper
Source URL: [http://ir.ia.ac.cn/handle/173211/42212]
Collection: Institute of Automation, National Laboratory of Pattern Recognition, Image and Video Analysis Group
Corresponding author: Shi, Lei
Author affiliations:
1. CAS Center for Excellence in Brain Science and Intelligence Technology, China
2. Institute of Automation, Chinese Academy of Sciences, NLPR AIRIA, China
3. State Grid Intelligence Technology Co. Ltd., China
4. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
Recommended citation (GB/T 7714):
Shi, Lei, Zhang, Yifan, Wang, Wanguo, et al. Rethinking the PID optimizer for stochastic optimization of deep networks[C]. In: London, United Kingdom. July 6, 2020 - July 10, 2020.

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.


© Copyright 2017 CSpace - Powered by CSpace