Multi-Granularity Graph Contrastive Learning Recommender Algorithm

DOI: 10.12677/mos.2024.133193, PP. 2097-2110

Keywords: Recommender System, Graph Neural Network, Graph Contrastive Learning, Multi-Granularity


Abstract:

Existing graph contrastive learning recommendation algorithms are mostly confined to node-level or graph-level contrast, and therefore do not exploit the information in the graph comprehensively. To address this limitation, a Multi-Granularity Graph Contrastive Learning (MGGCL) recommendation algorithm is proposed. MGGCL fuses contrastive learning at the node, subgraph, and full-graph levels to model users and items across granularities. At the node level, a neighborhood contrastive learning method is introduced. At the subgraph level, random walks generate diverse local structures for subgraph-level contrast. At the full-graph level, contrastive views of the whole graph are constructed by sampling combinations of similar subgraphs with the original graph. Finally, a multi-task strategy jointly optimizes the recommendation supervision task and the contrastive learning tasks to improve recommendation quality. Experiments on the real-world Yelp and Amazon-Book datasets, evaluated with Recall and NDCG, show that the proposed model outperforms the baseline models, improving Recall@20 by 4.24% and 6.85% and NDCG@20 by 4.04% and 9.66% on the two datasets, respectively.
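As a rough illustration of the multi-task objective described above, the sketch below combines a BPR recommendation loss with InfoNCE-style contrastive terms at the node, subgraph, and full-graph granularities. This is a minimal Python (PyTorch) sketch based on common practice in graph contrastive learning recommenders, not the paper's implementation: the InfoNCE form, the temperature tau, the loss weights lams, and every function and argument name are assumptions introduced here for illustration.

    # Hypothetical sketch of a multi-granularity, multi-task training objective.
    # The InfoNCE form, temperature, and loss weights are assumptions drawn from
    # common graph contrastive learning recommenders, not the paper's exact design.
    import torch
    import torch.nn.functional as F

    def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
        """Contrast two views: row i of z1 is pulled toward row i of z2 and
        pushed away from all other rows (in-batch negatives)."""
        z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
        logits = z1 @ z2.t() / tau                     # pairwise cosine similarities
        labels = torch.arange(z1.size(0), device=z1.device)
        return F.cross_entropy(logits, labels)

    def bpr_loss(user_e, pos_e, neg_e):
        """Bayesian Personalized Ranking loss for the recommendation task."""
        pos_score = (user_e * pos_e).sum(dim=-1)
        neg_score = (user_e * neg_e).sum(dim=-1)
        return -F.logsigmoid(pos_score - neg_score).mean()

    def multi_task_loss(user_e, pos_e, neg_e,
                        node_v1, node_v2,    # node embeddings under two views
                        sub_v1, sub_v2,      # pooled random-walk subgraph embeddings
                        graph_v1, graph_v2,  # pooled full-graph view embeddings
                        lams=(0.1, 0.1, 0.1)):
        """Joint objective: recommendation loss plus one contrastive term per
        granularity, weighted by the (assumed) coefficients in lams."""
        return (bpr_loss(user_e, pos_e, neg_e)
                + lams[0] * info_nce(node_v1, node_v2)
                + lams[1] * info_nce(sub_v1, sub_v2)
                + lams[2] * info_nce(graph_v1, graph_v2))

In training, the paired embeddings for each granularity would come from a graph encoder (for example, a LightGCN-style backbone) run on the original interaction graph and on its augmented views (neighborhood structure, random-walk subgraphs, and similar-subgraph/original-graph combinations), and the combined loss is back-propagated once per batch.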

