
OALib Journal
ISSN: 2333-9721
Fee: US$99


A Review of the Research Status of Dimensional Emotion Model

DOI: 10.12677/ap.2024.143158, PP. 270-278

Keywords: Dimensional Emotion Model, Dimensional Emotion Labeling, Emotional Speech Database


Abstract:

This paper summarizes the research status and progress of dimensional emotion models in three areas: dimensional emotion description models, dimensional emotion annotation methods, and representative dimensional emotion speech databases. It aims to introduce dimensional emotion models as comprehensively as possible and to provide a valuable academic reference for researchers in the field. Finally, the advantages and shortcomings of dimensional emotion models are summarized in light of the current state of research, and directions for future development are discussed.
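As background for readers unfamiliar with the topic: a dimensional emotion model (in the spirit of Russell's circumplex model) describes an emotion as a point in a continuous space, typically spanned by valence (pleasantness) and arousal (activation), rather than as a discrete category. The sketch below is purely illustrative and not taken from the paper; the class name, value ranges, and quadrant labels are hypothetical conventions chosen for the example.

```python
from dataclasses import dataclass


@dataclass
class EmotionPoint:
    """An emotion as a point in valence-arousal space.

    Both coordinates are assumed to lie in [-1, 1]:
    valence: unpleasant (-1) .. pleasant (+1)
    arousal: calm (-1) .. activated (+1)
    """
    valence: float
    arousal: float

    def quadrant(self) -> str:
        """Coarse categorical reading of the continuous coordinates."""
        if self.valence >= 0:
            return "happy/excited" if self.arousal >= 0 else "calm/content"
        return "angry/afraid" if self.arousal >= 0 else "sad/bored"


# High valence and high arousal falls in the upper-right quadrant.
print(EmotionPoint(0.7, 0.6).quadrant())
# Negative valence and low arousal falls in the lower-left quadrant.
print(EmotionPoint(-0.5, -0.4).quadrant())
```

The point of the continuous representation, as the review discusses, is that annotators can rate subtle or mixed emotional states that do not fit a small set of basic-emotion labels; the quadrant mapping above only recovers a coarse categorical view.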


