Scalable Wide Neural Network: A Parallel, Incremental Learning Model Using Splitting Iterative Least Squares
Authors: Xi, Jiangbo (2); Ersoy, Okan K. (3); Fang, Jianwu (4); Cong, Ming (5); Wei, Xin (1); Wu, Tianjun (6)
Journal: IEEE Access
Year: 2021
Volume: 9
Pages: 50767-50781
Keywords: Wide neural network; least squares; fast training; incremental learning
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3068880
Property rights ranking: 5
Abstract:

With the rapid development of machine learning models, especially deep learning, increasing effort has been devoted to designing new learning models with properties such as fast training with good convergence and incremental learning that overcomes catastrophic forgetting. In this paper, we propose a scalable wide neural network (SWNN) composed of multiple multi-channel wide RBF neural networks (MWRBFs). Each MWRBF focuses on a different region of the data and performs nonlinear transformations with Gaussian kernels. The number of MWRBFs in the proposed SWNN is determined by the scale and difficulty of the learning task. A splitting iterative least squares (SILS) training method is proposed to keep training tractable on large, high-dimensional data. Because the least squares solution already yields good weights in the first iteration, only a few subsequent iterations are needed to fine-tune the SWNN. Experiments were performed on several datasets, including grayscale and colored MNIST data and hyperspectral remote sensing data (KSC, Pavia Center, Pavia University, and Salinas), and the results were compared with mainstream learning models. The results show that the proposed SWNN is highly competitive with the other models.

License: CC BY
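As a rough illustration of the kernel-plus-least-squares idea described in the abstract, the sketch below builds a single wide RBF block (Gaussian-kernel features followed by a linear readout) and fits its output weights with a block-splitting least-squares iteration. This is only a minimal reading of the abstract, not the authors' SILS algorithm: the center selection, kernel width, block split, and number of sweeps (gaussian_features, block_least_squares, gamma, n_blocks, n_sweeps) are all illustrative assumptions.

import numpy as np

def gaussian_features(X, centers, gamma):
    """Gaussian-kernel responses of each sample in X to each center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def block_least_squares(H, Y, n_blocks=4, n_sweeps=3):
    """Solve min ||H W - Y||^2 by a block-splitting least-squares iteration.

    The columns of H are split into blocks; each block's weights are refit
    by ordinary least squares against the residual left by the other blocks.
    The first sweep already gives a good fit; later sweeps only fine-tune.
    """
    blocks = np.array_split(np.arange(H.shape[1]), n_blocks)
    W = np.zeros((H.shape[1], Y.shape[1]))
    for _ in range(n_sweeps):
        for idx in blocks:
            # Target for this block: residual with its own contribution removed.
            residual = Y - H @ W + H[:, idx] @ W[idx]
            W[idx] = np.linalg.lstsq(H[:, idx], residual, rcond=None)[0]
    return W

if __name__ == "__main__":
    # Toy usage: 3-class one-hot targets on random 2-D data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 2))
    labels = (X[:, 0] > 0).astype(int) + (X[:, 1] > 0).astype(int)
    Y = np.eye(3)[labels]
    centers = X[rng.choice(len(X), size=40, replace=False)]  # assumed: random centers
    H = gaussian_features(X, centers, gamma=0.5)
    W = block_least_squares(H, Y)
    accuracy = ((H @ W).argmax(axis=1) == labels).mean()
    print(f"training accuracy: {accuracy:.3f}")

In the paper's setting, several such blocks (the MWRBFs) would each cover a different region of the data and could be trained in parallel, with the block-wise least-squares solves replacing gradient-based training.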

Language: English
Publisher: Institute of Electrical and Electronics Engineers Inc.
WOS Record No.: WOS:000638390700001
Document type: Journal article
Source URL: http://ir.opt.ac.cn/handle/181661/94603
Collection: Xi'an Institute of Optics and Precision Mechanics, Laboratory of Space Optics Applications
Author affiliations:
1. Xi'an Institute of Optics and Precision Mechanics of CAS, Xi'an 710119, China, and University of Chinese Academy of Sciences, Beijing 100039, China;
2. College of Geological Engineering and Geomatics, Chang'an University, Xi'an 710054, China, and Key Laboratory of Western China's Mineral Resources and Geological Engineering, Ministry of Education, Xi'an 710054, China (e-mail: xijiangbo@chd.edu.cn);
3. Purdue University, West Lafayette, IN 47907, USA;
4. College of Transportation Engineering, Chang'an University, Xi'an 710064, China;
5. College of Geological Engineering and Geomatics, Chang'an University, Xi'an 710054, China, and Key Laboratory of Western China's Mineral Resources and Geological Engineering, Ministry of Education, Xi'an 710054, China;
6. Department of Mathematics and Information Science, College of Science, Chang'an University, Xi'an 710064, China.
Recommended citation:
GB/T 7714: Xi, Jiangbo, Ersoy, Okan K., Fang, Jianwu, et al. Scalable Wide Neural Network: A Parallel, Incremental Learning Model Using Splitting Iterative Least Squares[J]. IEEE Access, 2021, 9: 50767-50781.
APA: Xi, Jiangbo, Ersoy, Okan K., Fang, Jianwu, Cong, Ming, Wei, Xin, & Wu, Tianjun. (2021). Scalable Wide Neural Network: A Parallel, Incremental Learning Model Using Splitting Iterative Least Squares. IEEE Access, 9, 50767-50781.
MLA: Xi, Jiangbo, et al. "Scalable Wide Neural Network: A Parallel, Incremental Learning Model Using Splitting Iterative Least Squares". IEEE Access 9 (2021): 50767-50781.