
Bimodal Emotion Recognition from Facial Expressions and Body Gestures

DOI: 10.11834/jig.20130906

Keywords: facial expression, body gesture, bimodal emotion recognition, spatio-temporal features, bilateral sparse partial least squares (BSPLS)


Abstract:

Multimodal emotion recognition is an important topic in affective computing. This work studies bimodal emotion recognition from facial expressions and body gestures and proposes a recognition method based on bilateral sparse partial least squares (BSPLS). First, spatio-temporal features of the expression and gesture modalities are extracted from video image sequences as emotion feature vectors. Then, BSPLS is applied as a dimensionality-reduction step to further extract the emotion-relevant features from the two modalities, which are combined into a new fused feature vector. Finally, two classifiers are used for emotion classification. Experiments are conducted on the widely used FABO bimodal face-and-body expression database, and the method is compared against several subspace methods (principal component analysis, canonical correlation analysis, and partial least squares regression) to evaluate its recognition performance. The results show that fusing the two modalities is more effective than using either modality alone, and that the BSPLS algorithm achieves the highest emotion recognition rate among the compared methods.
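To make the fusion step concrete, below is a minimal sketch of a bilateral sparse PLS in the spirit of the abstract: it imposes sparsity on the loading vectors of both the expression block and the gesture block, then concatenates the projected scores into a fused feature vector for a downstream classifier. The soft-thresholding penalties (`lam_x`, `lam_y`), the rank-one deflation scheme, and the random stand-in feature matrices are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(a, lam):
    """Element-wise soft-thresholding: sign(a) * max(|a| - lam, 0)."""
    return np.sign(a) * np.maximum(np.abs(a) - lam, 0.0)

def bilateral_sparse_pls(X, Y, n_components=5, lam_x=0.1, lam_y=0.1,
                         n_iter=100, tol=1e-6):
    """Sketch of bilateral sparse PLS (sparsity on both blocks' loadings).

    X: (n_samples, p) expression features; Y: (n_samples, q) gesture features.
    Returns sparse loading matrices U (p, k) and V (q, k).
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    U, V = [], []
    for _ in range(n_components):
        M = X.T @ Y                                   # cross-covariance of the two modalities
        v = np.linalg.svd(M, full_matrices=False)[2][0]  # init: dominant right singular vector
        for _ in range(n_iter):
            u = soft_threshold(M @ v, lam_x)          # sparse loadings for the expression block
            u /= (np.linalg.norm(u) + 1e-12)
            v_new = soft_threshold(M.T @ u, lam_y)    # sparse loadings for the gesture block
            v_new /= (np.linalg.norm(v_new) + 1e-12)
            if np.linalg.norm(v_new - v) < tol:
                v = v_new
                break
            v = v_new
        U.append(u)
        V.append(v)
        # rank-one deflation of each block along its extracted direction
        X = X - np.outer(X @ u, u)
        Y = Y - np.outer(Y @ v, v)
    return np.column_stack(U), np.column_stack(V)

# Demo with random stand-ins for the spatio-temporal feature vectors.
rng = np.random.default_rng(0)
X_expr = rng.normal(size=(120, 300))   # expression features (hypothetical dimensions)
Y_gest = rng.normal(size=(120, 200))   # gesture features (hypothetical dimensions)
U, V = bilateral_sparse_pls(X_expr, Y_gest, n_components=8)

# Fused feature vector: project each modality onto its sparse loadings and concatenate.
# In practice the features should be centered with training-set means before projection.
fused = np.hstack([X_expr @ U, Y_gest @ V])  # feed to a classifier (e.g., SVM or nearest neighbor)
```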

