Article type: Research Article
Authors: Wei, Tao [a] | Yang, Changchun [b],[*] | Zheng, Yanqi [a] | Zhang, Jingxue [a]
Affiliations: [a] School of Computer Science and Artificial Intelligence, Aliyun School of Big Data, School of Software, Changzhou University, Changzhou, Jiangsu, China | [b] School of Microelectronics and Control Engineering, Changzhou University, Changzhou, Jiangsu, China
Correspondence: [*] Corresponding author. Changchun Yang, School of Microelectronics and Control Engineering, Changzhou University, Changzhou, Jiangsu, China. E-mail: ycc@cczu.edu.cn.
Abstract: Recently, Graph Neural Networks (GNNs), which aggregate collaborative information from node neighborhoods, have shown effectiveness in recommendation. However, GNN-based models suffer from over-smoothing and data sparsity. Owing to its self-supervised nature, contrastive learning has gained considerable attention in recommendation as a means of alleviating highly sparse data. Graph contrastive learning models are widely used to learn consistent representations across different graph augmentation views. However, most current graph augmentations rely on random perturbation, which destroys the original graph structure and misleads embedding learning. In this paper, an effective graph contrastive learning paradigm, CollaGCL, is proposed, which constructs graph augmentations using singular value decomposition to preserve crucial structural information. CollaGCL enables the perturbed views to effectively capture global collaborative information, mitigating the negative impact of graph structural perturbations. To optimize the contrastive learning task, the extracted meta-knowledge is propagated throughout the original graph to learn reliable embedding representations. Self-information learning between views enhances the semantic information of nodes, thus alleviating over-smoothing. Experimental results on three real-world datasets demonstrate that CollaGCL yields significant improvements over state-of-the-art methods.
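The abstract's key idea, building an augmented graph view via singular value decomposition so that dominant collaborative structure is preserved rather than randomly perturbed, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the toy interaction matrix, and the choice of rank are assumptions for illustration only. A truncated SVD keeps the top-q singular triples of the user-item interaction matrix, yielding a low-rank view that retains global structure:

```python
import numpy as np

def svd_augmented_view(interaction: np.ndarray, q: int) -> np.ndarray:
    """Reconstruct the interaction matrix from its top-q singular triples.

    Keeping only the largest singular values preserves the dominant
    (global) collaborative structure while discarding noisy components.
    """
    u, s, vt = np.linalg.svd(interaction, full_matrices=False)
    return u[:, :q] @ np.diag(s[:q]) @ vt[:q, :]

# Toy 4-user x 3-item binary interaction matrix (hypothetical data).
R = np.array([[1., 0., 1.],
              [1., 0., 1.],
              [0., 1., 0.],
              [1., 1., 1.]])

# Rank-2 augmented view: same shape as R, dominant structure retained.
R_aug = svd_augmented_view(R, q=2)
print(np.round(R_aug, 2))
```

In a real recommender the interaction matrix is large and sparse, so a randomized or truncated SVD routine would typically be used instead of the dense `np.linalg.svd` shown here.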
Keywords: Self-supervised learning, recommendation, contrastive learning, data augmentation
DOI: 10.3233/JIFS-236497
Journal: Journal of Intelligent & Fuzzy Systems, vol. Pre-press, no. Pre-press, pp. 1-14, 2024