CORC > Peking University > School of Electronics Engineering and Computer Science
Entropy regularized likelihood learning on Gaussian mixture: Two gradient implementations for automatic model selection
Lu, Zhiwu
Journal: Neural Processing Letters
Year: 2007
Keywords: competitive learning; Gaussian mixture; model selection; regularization theory; curve detection algorithm
DOI: 10.1007/s11063-006-9028-3
Abstract: In Gaussian mixture modeling, it is crucial to select the number of Gaussians, i.e., the mixture model, for a given sample data set. Under regularization theory, we address this model selection problem by implementing entropy regularized likelihood (ERL) learning on Gaussian mixtures via a batch gradient learning algorithm. Simulation experiments demonstrate that this gradient ERL learning algorithm can automatically select an appropriate number of Gaussians during parameter learning on a sample data set and yields a good estimate of the parameters of the actual Gaussian mixture, even when two or more of the actual Gaussians overlap strongly. We further give an adaptive gradient implementation of ERL learning on Gaussian mixtures, together with a theoretical analysis, and identify a mechanism of generalized competitive learning implicit in ERL learning.
Subject categories: Computer Science, Artificial Intelligence; Neurosciences. Indexed in: SCI(E); EI. Volume 25, pages 17-30.
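The batch gradient ERL learning described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's: the exact objective (mean log-likelihood plus a posterior-entropy penalty weighted by a coefficient gamma), the softmax parametrization of the mixing weights, and all hyperparameters (gamma, learning rate, number of candidate components K, iteration count) are this example's assumptions. The key mechanism it demonstrates is that the entropy term turns the usual posteriors h into modified responsibilities U that reward dominant components and suppress the rest, so redundant mixing weights shrink toward zero during gradient ascent.

```python
import numpy as np

def erl_gmm_fit(x, K=4, gamma=0.4, lr=0.05, iters=3000):
    """Batch gradient ascent on an entropy-regularized likelihood (ERL)
    for a 1-D Gaussian mixture (illustrative sketch, not the paper's code).

    Objective (per sample): log sum_k pi_k N(x; mu_k, var_k)
                            + gamma * sum_k h_k log h_k,
    where h_k is the posterior of component k.  Mixing weights use a
    softmax parametrization beta so they stay on the simplex."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))   # spread initial means over the data
    log_var = np.zeros(K)                           # variances start at 1
    beta = np.zeros(K)                              # softmax logits -> equal weights
    for _ in range(iters):
        pi = np.exp(beta - beta.max()); pi /= pi.sum()
        var = np.exp(log_var)
        # log densities and posteriors h[n, k], computed stably
        logf = -0.5 * ((x[:, None] - mu) ** 2 / var + np.log(2 * np.pi * var))
        logw = np.log(pi) + logf
        logw -= logw.max(axis=1, keepdims=True)
        h = np.exp(logw); h /= h.sum(axis=1, keepdims=True)
        hc = np.clip(h, 1e-12, 1.0)
        H = (hc * np.log(hc)).sum(axis=1, keepdims=True)  # sum_k h_k log h_k
        # modified responsibilities; they still sum to 1 over k, but
        # dominant components get amplified and weak ones suppressed
        U = h * (1.0 + gamma * (np.log(hc) - H))
        d = x[:, None] - mu
        mu += lr * (U * d / var).mean(axis=0)
        log_var += lr * (U * (d ** 2 / var - 1.0)).mean(axis=0) / 2
        beta += lr * (U - pi).mean(axis=0)
    pi = np.exp(beta - beta.max()); pi /= pi.sum()
    return pi, mu, np.exp(log_var)

# toy data: two well-separated Gaussians, fitted with K=4 candidate components
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
pi, mu, var = erl_gmm_fit(x, K=4)
print(np.round(pi, 3), np.round(mu, 2))
```

With well-separated clusters, the two candidate components initialized inside each cluster compete; the entropy term makes one of them win, so after training only about two components keep a substantial mixing weight, matching the paper's reported automatic model selection. (The strongly-overlapping case the abstract mentions would need more careful tuning of gamma.)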
Language: English
Content type: Journal article
Source URL: http://ir.pku.edu.cn/handle/20.500.11897/161907
Collection: School of Electronics Engineering and Computer Science
Recommended citation:
GB/T 7714: Lu, Zhiwu. Entropy regularized likelihood learning on Gaussian mixture: Two gradient implementations for automatic model selection[J]. Neural Processing Letters, 2007.
APA: Lu, Zhiwu. (2007). Entropy regularized likelihood learning on Gaussian mixture: Two gradient implementations for automatic model selection. Neural Processing Letters.
MLA: Lu, Zhiwu. "Entropy regularized likelihood learning on Gaussian mixture: Two gradient implementations for automatic model selection." Neural Processing Letters (2007).