OALib Journal
ISSN: 2333-9721
Sentiment Analysis of Online Product Reviews Based on Text Graph Convolutional Networks

DOI: 10.12677/sa.2024.133058, PP. 578-587

Keywords: Pre-Training Model, Text Graph Convolutional Network, Sentiment Analysis


Abstract:

Existing text sentiment analysis models cannot fully extract the semantic and global feature information of online product reviews, resulting in low classification accuracy. Pre-trained models possess strong semantic understanding but lack global feature information, whereas a text graph convolutional network can integrate text dependency information with global feature information to improve text representation. To address these problems, a new sentiment analysis model called ABGCN is proposed. First, the lightweight pre-trained model ALBERT performs word vectorization, producing the node vectors of the constructed text graph. These embeddings are then fed into the text graph convolutional network for joint training, where iterative updates extract feature information from online product reviews, resolving the inability of traditional models to fully capture both semantic and global structural information. Finally, the resulting features are passed to a softmax classifier for sentiment classification. Comparative experiments verify that the proposed model achieves high performance.
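The pipeline described above (pre-trained embeddings as graph node features, graph convolution, then softmax classification) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the random feature matrix stands in for ALBERT embeddings, the toy adjacency matrix stands in for the word-document co-occurrence graph, and the two-layer Kipf-style GCN with symmetric normalization is assumed from the cited TextGCN/BertGCN line of work.

```python
import numpy as np

def softmax(z):
    # Row-wise softmax for the final sentiment-classification layer.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def normalize_adj(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, standard in GCNs.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_forward(A, X, W1, W2):
    # Two-layer GCN: neighborhood aggregation + ReLU, then softmax output.
    A_norm = normalize_adj(A)
    H = np.maximum(A_norm @ X @ W1, 0.0)   # layer 1: propagate and activate
    return softmax(A_norm @ H @ W2)        # layer 2: class probabilities

rng = np.random.default_rng(0)
n_nodes, dim, hidden, n_classes = 6, 8, 4, 2  # e.g. positive / negative

# Placeholder node features standing in for ALBERT word/document embeddings.
X = rng.normal(size=(n_nodes, dim))

# Toy symmetric adjacency: nodes 0-3 as "words", nodes 4-5 as "documents".
A = np.zeros((n_nodes, n_nodes))
for i, j in [(0, 4), (1, 4), (1, 5), (2, 5), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

W1 = rng.normal(size=(dim, hidden))
W2 = rng.normal(size=(hidden, n_classes))

probs = gcn_forward(A, X, W1, W2)
print(probs.shape)  # one class distribution per graph node
```

In the full model the weights would be trained jointly with (or on top of) the fine-tuned ALBERT encoder, and only the document-node outputs would be read off for sentiment prediction; here the forward pass alone shows how graph structure mixes neighboring node features before classification.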

