Fast Adaboost training algorithm by dynamic weight trimming |
Jia Hui-Xing ; Zhang Yu-Jin |
2010-10-12 |
Keywords | Adaboost training algorithm; dynamic weight trimming; large dataset; weak classifier; learning (artificial intelligence); sampling methods; C1230L Learning in AI; C1140Z Other topics in statistics |
Abstract | This paper presents a novel fast Adaboost training algorithm based on dynamic weight trimming, which greatly increases training speed on large datasets. At each iteration, the algorithm discards most of the samples with small weights and trains the weak classifier on only the samples with large weights. It then checks the performance of the weak classifier on all samples; if the weighted error exceeds 0.5, it enlarges the training subset and retrains the weak classifier. Because only a small portion of the samples is used to train each weak classifier, training speed is greatly increased. |
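The trimming loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: decision stumps are assumed as weak learners, and the initial subset fraction `keep` and the doubling schedule for enlarging the subset are illustrative choices not specified in the abstract.

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustive search for the decision stump (feature, threshold,
    polarity) minimizing weighted error on the given (sub)sample."""
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best[:3]

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost_dwt(X, y, T=10, keep=0.3):
    """AdaBoost with dynamic weight trimming: each round trains the weak
    learner on only the highest-weight fraction `keep` of the samples,
    enlarging that fraction until the weighted error on ALL samples
    drops below 0.5 (or the full set is used)."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(T):
        frac = keep
        while True:
            # keep only the largest-weight samples for weak-learner training
            idx = np.argsort(w)[::-1][: max(1, int(frac * n))]
            stump = train_stump(X[idx], y[idx], w[idx])
            # check the weak classifier on ALL samples
            err = w[stump_predict(stump, X) != y].sum()
            if err < 0.5 or frac >= 1.0:
                break
            frac = min(1.0, 2 * frac)  # enlarge subset and retrain
        err = np.clip(err, 1e-10, 0.5 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # standard AdaBoost weight update over all samples
        w *= np.exp(-alpha * y * stump_predict(stump, X))
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)
```

The speed-up comes from the inner call to `train_stump` seeing only `frac * n` samples; the full-set pass is a single cheap prediction used to verify the weak learner is still better than chance.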
Language | Chinese |
Publisher | Science Press ; China |
Content type | Journal article |
Source URL | [http://hdl.handle.net/123456789/82159] |
Collection | Tsinghua University |
Recommended citation (GB/T 7714) | Jia Hui-Xing, Zhang Yu-Jin. Fast Adaboost training algorithm by dynamic weight trimming[J]. 2010. |
APA | Jia Hui-Xing, & Zhang Yu-Jin. (2010). Fast Adaboost training algorithm by dynamic weight trimming. |
MLA | Jia Hui-Xing, and Zhang Yu-Jin. "Fast Adaboost training algorithm by dynamic weight trimming." (2010). |
Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.