TP-ADMM: An Efficient Two-Stage Framework for Training Binary Neural Networks
Yong Yuan1,2; Chen Chen1,2; Xiyuan Hu1,2; Silong Peng1,2,3
Year: 2019
Conference date: 2019
Conference venue: Sydney, Australia
Abstract

Deep Neural Networks (DNNs) are powerful and successful but suffer from high computation and memory costs. As a useful remedy, binary neural networks represent weights and activations with binary values, which can significantly reduce resource consumption. However, binarizing both simultaneously introduces a coupling effect that aggravates the difficulty of training. In this paper, we develop a novel framework named TP-ADMM that decouples the binarization process into two iteratively optimized stages. First, we propose an improved target propagation method to optimize the network with binary activations in a more stable manner. Second, we cast weight binarization as a discretely constrained optimization problem and solve it with the alternating direction method of multipliers (ADMM) under a varying penalty. Experiments on three public image-classification datasets show that the proposed method outperforms existing methods.
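The weight-binarization stage described in the abstract can be illustrated with a minimal ADMM sketch. The loss (a small least-squares problem), data, learning rate, and penalty schedule below are illustrative assumptions, not the paper's exact setup; the sketch only shows the general pattern of an ADMM loop with an increasing penalty driving weights onto a binary set:

```python
import numpy as np

# Hypothetical toy problem: minimize ||Xw - y||^2 subject to w binary.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
w_true = np.sign(rng.standard_normal(8))
y = X @ w_true

W = 0.1 * rng.standard_normal(8)  # real-valued weights
G = W.copy()                      # auxiliary variable, projected to binary
U = np.zeros(8)                   # scaled dual variable
rho, lr = 0.1, 0.01               # assumed penalty and step size

for it in range(300):
    # W-update: gradient step on loss + (rho/2) * ||W - G + U||^2
    grad = 2 * X.T @ (X @ W - y) / len(y) + rho * (W - G + U)
    W -= lr * grad
    # G-update: project W + U onto {-alpha, +alpha} (optimal scaling
    # alpha is the mean absolute value, as in common binarization schemes)
    V = W + U
    alpha = np.abs(V).mean()
    G = alpha * np.sign(V)
    # Dual update, then grow the penalty (the "varying penalty" idea)
    U += W - G
    rho *= 1.02

# G now holds only two values, +alpha and -alpha.
print(np.unique(np.abs(G)).size)
```

The growing penalty gradually tightens the consensus constraint W = G, so the real-valued weights are pulled toward the binary projection as training proceeds rather than being quantized abruptly.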

Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/25819
Collection: Institute of Automation — Research Center for Intelligent Manufacturing Technology and Systems — Multi-dimensional Data Analysis Team
Corresponding author: Chen Chen
Author affiliations:
1. Institute of Automation, Chinese Academy of Sciences, Beijing, China
2. University of Chinese Academy of Sciences, Beijing, China
3. Beijing ViSystem Corporation Limited, China
Recommended citation (GB/T 7714):
Yong Yuan, Chen Chen, Xiyuan Hu, et al. TP-ADMM: An Efficient Two-Stage Framework for Training Binary Neural Networks[C]. Sydney, Australia, 2019.
