Self-Aware Neural Network Systems: A Survey and New Perspective
Du, Zidong2,3; Guo, Qi2; Zhao, Yongwei2,3; Chen, Yunji1,2,4,5; Xu, Zhiwei2; Zhi, Tian2,3
Journal | PROCEEDINGS OF THE IEEE
Publication Date | 2020-07-01
Volume | 108, Issue 7, Pages 1047-1067
Keywords | Artificial neural networks; Neurons; Self-aware; Monitoring; Sensors; Program processors; Logic gates; Self-aware neural network (NN) processors
ISSN | 0018-9219
DOI | 10.1109/JPROC.2020.2977722 |
Abstract | Neural network (NN) processors are specially designed to handle deep learning tasks by utilizing multilayer artificial NNs. They have been demonstrated to be useful in broad application fields such as image recognition, speech processing, machine translation, and scientific computing. Meanwhile, innovative self-aware techniques, whereby a system can dynamically react based on continuously sensed information from the execution environment, have attracted attention from both academia and industry. Various self-aware techniques have been applied to NN systems to significantly improve computational speed and energy efficiency. This article surveys state-of-the-art self-aware NN systems (SaNNSs), which can be achieved at different layers, that is, the architectural layer, the physical layer, and the circuit layer. At the architectural layer, SaNNSs can be characterized from a data-centric perspective where different data properties (i.e., data value, data precision, dataflow, and data distribution) are exploited. At the physical layer, various parameters of the physical implementation are considered. At the circuit layer, different logics and devices can be used for high efficiency. The self-awareness of existing SaNNSs is still in a preliminary form. We propose a comprehensive SaNNS from a new perspective, that is, the model layer, to exploit more opportunities for high efficiency. The proposed system, called MinMaxNN, features model switching and elastic sparsity based on monitored information from the execution environment. The model switching mechanism means that models (i.e., a min and a max model) dynamically switch given different inputs, for both efficiency and accuracy. The elastic sparsity mechanism means that the sparsity of NNs can be dynamically adjusted in each layer for efficiency.
The experimental results show that, compared with traditional SaNNSs, MinMaxNN achieves a 5.64× performance improvement and a 19.66% energy reduction, without notable loss of accuracy or negative effects on developers' productivity.
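The two mechanisms described in the abstract can be sketched in software as follows. This is a hypothetical illustration only: `TwoModelCascade`, `prune_layer`, and `conf_threshold` are invented names, and the actual MinMaxNN operates at the processor/system level rather than as Python code.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

class TwoModelCascade:
    """Model-switching sketch: run a cheap 'min' model first and fall back
    to an expensive 'max' model only when the min model is not confident.
    (Hypothetical illustration of the min/max switching idea.)"""

    def __init__(self, min_model, max_model, conf_threshold=0.9):
        self.min_model = min_model        # fast, lower-accuracy model
        self.max_model = max_model        # slow, higher-accuracy model
        self.conf_threshold = conf_threshold

    def predict(self, x):
        probs = softmax(self.min_model(x))
        if probs.max() >= self.conf_threshold:
            return int(probs.argmax()), "min"   # confident: cheap path
        probs = softmax(self.max_model(x))
        return int(probs.argmax()), "max"       # uncertain: accurate path

def prune_layer(weights, keep_ratio):
    """Elastic-sparsity sketch: keep only the largest-magnitude fraction
    of a layer's weights; keep_ratio can be tuned per layer at run time."""
    k = max(1, int(weights.size * keep_ratio))
    thresh = np.sort(np.abs(weights).ravel())[-k]
    return np.where(np.abs(weights) >= thresh, weights, 0.0)
```

A plausible use: wrap a distilled network as `min_model` and the full network as `max_model`, so easy inputs exit through the cheap path while hard inputs still reach the accurate model, and raise `keep_ratio` only for layers whose pruning measurably hurts accuracy.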
Funding | National Key Research and Development Program of China[2017YFB1003101] ; National Key Research and Development Program of China[2018AAA0103300] ; National Key Research and Development Program of China[2017YFA0700900] ; National Key Research and Development Program of China[2017YFA0700902] ; National Key Research and Development Program of China[2017YFA0700901] ; NSF of China[61732007] ; NSF of China[61432016] ; NSF of China[61532016] ; NSF of China[61672491] ; NSF of China[61602441] ; NSF of China[61602446] ; NSF of China[61732002] ; NSF of China[61702478] ; NSF of China[61732020] ; Beijing Natural Science Foundation[JQ18013] ; National Science and Technology Major Project[2018ZX01031102] ; Transformation and Transfer of Scientific and Technological Achievements of Chinese Academy of Sciences[KFJ-HGZX-013] ; Key Research Projects in Frontier Science of Chinese Academy of Sciences[QYZDB-SSW-JSC001] ; Strategic Priority Research Program of Chinese Academy of Science[XDB32050200] ; Strategic Priority Research Program of Chinese Academy of Science[XDC01020000] ; Standardization Research Project of Chinese Academy of Sciences[BZ201800001] ; Beijing Academy of Artificial Intelligence (BAAI) through the Beijing Nova Program of Science and Technology[Z191100001119093]
WOS Research Area | Engineering
Language | English
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Accession Number | WOS:000544973500006
Content Type | Journal Article
Source URL | [http://119.78.100.204/handle/2XEOYT63/15041]
Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English)
Corresponding Author | Chen, Yunji
Author Affiliations |
1. Shanghai Res Ctr Brain Sci & Brain Inspired Intel, Inst Brain Intelligence Technol Zhangjiang Lab BI, ZJLAB, Shanghai 201210, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China
3. Cambricon Technol, Beijing 100037, Peoples R China
4. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100864, Peoples R China
5. CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China
Recommended Citation (GB/T 7714) | Du, Zidong, Guo, Qi, Zhao, Yongwei, et al. Self-Aware Neural Network Systems: A Survey and New Perspective[J]. PROCEEDINGS OF THE IEEE, 2020, 108(7): 1047-1067.
APA | Du, Zidong, Guo, Qi, Zhao, Yongwei, Chen, Yunji, Xu, Zhiwei, & Zhi, Tian. (2020). Self-Aware Neural Network Systems: A Survey and New Perspective. PROCEEDINGS OF THE IEEE, 108(7), 1047-1067.
MLA | Du, Zidong, et al. "Self-Aware Neural Network Systems: A Survey and New Perspective". PROCEEDINGS OF THE IEEE 108.7 (2020): 1047-1067.