You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization
Zhang, Xinbang1,2; Huang, Zehao3; Wang, Naiyan3; Xiang, Shiming1,2; Pan, Chunhong1
Journal | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE |
Publication Date | 2021-09-01 |
Volume | 43 | Issue | 9 | Pages | 2891-2904 |
Keywords | Computer architecture ; Optimization ; Learning (artificial intelligence) ; Task analysis ; Acceleration ; Evolutionary computation ; Convolution ; Neural architecture search (NAS) ; convolutional neural network ; sparse optimization |
ISSN | 0162-8828 |
DOI | 10.1109/TPAMI.2020.3020300 |
Corresponding Author | Zhang, Xinbang (xinbang.zhang@nlpr.ia.ac.cn) |
Abstract | Recently, neural architecture search (NAS) has raised great interest in both academia and industry. However, it remains challenging because of its huge and non-continuous search space. Instead of applying evolutionary algorithms or reinforcement learning as in previous works, this paper proposes a direct sparse optimization NAS (DSO-NAS) method. The motivation behind DSO-NAS is to address the task from the viewpoint of model pruning. To achieve this goal, we start from a completely connected block, and then introduce scaling factors to scale the information flow between operations. Next, sparse regularizations are imposed to prune useless connections in the architecture. Lastly, an efficient and theoretically sound optimization method is derived to solve it. Our method enjoys the advantages of both differentiability and efficiency; therefore, it can be directly applied to large datasets like ImageNet and to tasks beyond classification. In particular, on the CIFAR-10 dataset, DSO-NAS achieves an average test error of 2.74 percent, while on the ImageNet dataset it achieves a 25.4 percent test error under 600M FLOPs with 8 GPUs in 18 hours. On the semantic segmentation task, DSO-NAS also achieves competitive results compared with manually designed architectures on the PASCAL VOC dataset. Code is available at https://github.com/XinbangZhang/DSO-NAS. |
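The abstract describes pruning connections by attaching scaling factors to edges and driving them to zero with a sparsity regularizer. A minimal sketch of that idea, assuming L1 regularization handled via its proximal operator (soft-thresholding); the array `lam` of edge scaling factors and the threshold value are illustrative, not taken from the paper:

```python
import numpy as np

def soft_threshold(lam, thresh):
    # Proximal operator of the L1 norm: shrinks each scaling factor toward
    # zero and sets small ones exactly to zero, pruning the matching edge.
    return np.sign(lam) * np.maximum(np.abs(lam) - thresh, 0.0)

# Toy scaling factors on the edges of a fully connected block (assumed values).
lam = np.array([0.9, 0.05, -0.4, 0.01, 0.7])
reg_strength = 0.1  # sparsity regularization weight (assumed)

pruned = soft_threshold(lam, reg_strength)
kept_edges = np.flatnonzero(pruned != 0.0)
print(pruned)       # small factors become exactly zero
print(kept_edges)   # indices of the connections that survive pruning
```

In a full search procedure this shrinkage step would alternate with ordinary gradient updates of the network weights; here only the pruning step is shown.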
Funding Projects | Major Project for New Generation of AI [2018AAA0100400] ; National Natural Science Foundation of China [91646207] ; National Natural Science Foundation of China [61976208] |
WOS Keywords | NETWORKS ; ALGORITHM ; GAME ; GO |
WOS Research Areas | Computer Science ; Engineering |
Language | English |
Publisher | IEEE COMPUTER SOC |
WOS Record ID | WOS:000681124300006 |
Funding Agencies | Major Project for New Generation of AI ; National Natural Science Foundation of China |
Content Type | Journal Article |
Source URL | [http://ir.ia.ac.cn/handle/173211/45631] |
Collection | Institute of Automation, National Laboratory of Pattern Recognition, Remote Sensing Image Processing Group |
Author Affiliations | 1. Chinese Acad Sci, Inst Automat, Dept Natl Lab Pattern Recognit, Beijing 100190, Peoples R China 2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China 3. Tusimple, Beijing 100020, Peoples R China |
Recommended Citation (GB/T 7714) | Zhang, Xinbang, Huang, Zehao, Wang, Naiyan, et al. You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43(9): 2891-2904. |
APA | Zhang, Xinbang, Huang, Zehao, Wang, Naiyan, Xiang, Shiming, & Pan, Chunhong. (2021). You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 43(9), 2891-2904. |
MLA | Zhang, Xinbang, et al. "You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization". IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 43.9 (2021): 2891-2904. |