Emotional head motion predicting from prosodic and linguistic features
Author | Yang, Minghao (1)
Journal | MULTIMEDIA TOOLS AND APPLICATIONS
Publication Date | 2016-05-01
Volume/Issue/Pages | 75(9): 5125-5146
Keywords | Visual Prosody ; Head Gesture ; Prosody Clustering
DOI | 10.1007/s11042-016-3405-3 |
Document Type | Article
Abstract | Emotional head motion plays an important role in human-computer interaction (HCI) and is an important factor in improving the user experience. However, it is still not clear how head motions are influenced by speech features in different emotional states. In this study, we aim to construct a bimodal mapping model from speech to head motions and to discover which prosodic and linguistic features most strongly influence emotional head motions. A two-layer clustering scheme is introduced to obtain reliable clusters from head motion parameters. With these clusters, an emotion-related speech-to-head-gesture mapping model is constructed by a Classification and Regression Tree (CART). Based on the statistical results of the CART, a systematic statistical map of the relationship between speech features (including prosodic and linguistic features) and head gestures is presented. The map reveals the features that have the most significant influence on head motions in long or short utterances. We also analyze how linguistic features contribute to different emotional expressions. The discussions in this work provide important references for the realistic animation of speech-driven talking heads or avatars. |
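The pipeline the abstract describes (two-layer clustering of head-motion parameters, then a CART mapping speech features to the resulting gesture clusters) can be sketched as follows. This is a minimal illustration, not the authors' code: the feature dimensions, cluster counts, and synthetic data are all assumptions, with KMeans standing in for the paper's clustering scheme and scikit-learn's `DecisionTreeClassifier` standing in for the CART.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins: per-utterance head-motion parameters (e.g. rotation
# statistics) and correlated prosodic features (e.g. F0, energy contours).
head_motion = rng.normal(size=(300, 6))
prosody = head_motion[:, :3] + 0.5 * rng.normal(size=(300, 3))

# Layer 1: coarse clustering of the head-motion parameters.
coarse = KMeans(n_clusters=4, n_init=10, random_state=0).fit(head_motion)

# Layer 2: re-cluster within each coarse cluster to obtain finer,
# more reliable gesture classes (the "two-layer clustering" idea).
labels = np.empty(len(head_motion), dtype=int)
next_id = 0
for c in range(4):
    idx = np.flatnonzero(coarse.labels_ == c)
    sub = KMeans(n_clusters=2, n_init=10, random_state=0).fit(head_motion[idx])
    labels[idx] = sub.labels_ + next_id
    next_id += 2

# CART: map prosodic features to the head-gesture cluster labels.
cart = DecisionTreeClassifier(max_depth=5, random_state=0).fit(prosody, labels)
print("training accuracy:", round(cart.score(prosody, labels), 2))
```

In the paper the CART's split statistics are then inspected to see which speech features dominate the mapping; with scikit-learn, `cart.feature_importances_` plays a comparable role.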
WOS Keywords | AFFECT RECOGNITION ; EXPRESSIONS ; ANIMATION ; DRIVEN ; FACE |
WOS Research Areas | Computer Science ; Engineering |
Language | English |
WOS Accession Number | WOS:000376601700018 |
Funding | National High-Tech Research and Development Program of China (863 Program) (2015AA016305) ; National Natural Science Foundation of China (NSFC) (61332017 ; 61375027 ; 61203258 ; 61273288 ; 61233009 ; 61425017) |
Content Type | Journal Article |
Source URL | http://ir.ia.ac.cn/handle/173211/12229 |
Collection | Institute of Automation, National Laboratory of Pattern Recognition, Human-Computer Speech Interaction Group |
Author Affiliations | 1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China 2. Univ Int Business & Econ, Sch Int Studies, Beijing, Peoples R China |
Recommended Citation (GB/T 7714) | Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, et al. Emotional head motion predicting from prosodic and linguistic features[J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2016, 75(9): 5125-5146. |
APA | Yang, Minghao, Jiang, Jinlin, Tao, Jianhua, Mu, Kaihui, & Li, Hao. (2016). Emotional head motion predicting from prosodic and linguistic features. MULTIMEDIA TOOLS AND APPLICATIONS, 75(9), 5125-5146. |
MLA | Yang, Minghao, et al. "Emotional head motion predicting from prosodic and linguistic features". MULTIMEDIA TOOLS AND APPLICATIONS 75.9 (2016): 5125-5146. |