Personalized Representation With Contrastive Loss for Recommendation Systems

IEEE Trans. Multim. (2024)

Abstract
Sequential recommendation mines a user's interaction sequence and temporal information to produce better recommendations, and is therefore attracting growing attention. Existing work on sequential recommendation tends to build new models, while the study of the loss function is largely neglected. Despite the recent surge of interest in contrastive learning, we argue that its key ingredient is the contrastive loss (CL), which also offers a new option for sequential recommendation. However, we find that CL works against the personalized representation of features. First, it is a purely relative constraint: it pushes positive and negative samples apart but imposes no absolute constraint on their scores. Second, recent studies have shown that embeddings should be uniformly distributed, yet CL only widens the gap between positive and negative samples within a training batch rather than distributing all items uniformly. These two shortcomings make the embedding space too compact, which harms personalized representation and recommendation. This paper therefore proposes Personalized Contrastive Loss (PCL), which combines CL with the absolute constraints of BCE/CE and employs regularization to make the representations uniformly distributed. Experiments on several commonly used datasets achieve state-of-the-art results. The code and data will be available on GitHub.
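The abstract describes PCL as a combination of a relative contrastive term, an absolute BCE/CE term, and a uniformity regularizer, but gives no equations. Below is a minimal PyTorch sketch of that combination, assuming dot-product scoring and the Gaussian-potential uniformity loss of Wang & Isola (2020) as the regularizer; the function name `pcl_loss`, the weights `alpha` and `gamma`, and the temperature `tau` are hypothetical illustrations, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F


def uniformity(x, t=2.0):
    """Gaussian-potential uniformity loss (Wang & Isola, 2020):
    log of the mean pairwise exp(-t * squared distance) on the unit sphere.
    Lower values mean embeddings are spread more uniformly."""
    x = F.normalize(x, dim=-1)
    return torch.pdist(x, p=2).pow(2).mul(-t).exp().mean().log()


def pcl_loss(seq_repr, pos_emb, neg_emb, tau=0.07, alpha=1.0, gamma=0.1):
    """Hypothetical PCL sketch: InfoNCE-style CL (relative constraint)
    + BCE (absolute constraint) + uniformity regularization.

    seq_repr: (B, d) user sequence representations
    pos_emb:  (B, d) embeddings of the positive (next) items
    neg_emb:  (B, K, d) embeddings of K sampled negative items
    """
    # Dot-product scores for positives and negatives.
    pos_score = (seq_repr * pos_emb).sum(-1)                    # (B,)
    neg_score = torch.einsum('bd,bkd->bk', seq_repr, neg_emb)   # (B, K)

    # Relative constraint: contrastive (InfoNCE) loss; the positive
    # is placed at index 0 of the logits.
    logits = torch.cat([pos_score.unsqueeze(1), neg_score], dim=1) / tau
    targets = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    cl = F.cross_entropy(logits, targets)

    # Absolute constraint: BCE pins positive scores toward 1 and
    # negative scores toward 0, independent of other samples.
    bce = F.binary_cross_entropy_with_logits(pos_score, torch.ones_like(pos_score)) \
        + F.binary_cross_entropy_with_logits(neg_score, torch.zeros_like(neg_score))

    # Uniformity regularizer over the item embeddings in the batch.
    items = torch.cat([pos_emb, neg_emb.reshape(-1, neg_emb.size(-1))], dim=0)
    unif = uniformity(items)

    return cl + alpha * bce + gamma * unif


# Usage example with random tensors.
B, K, d = 32, 8, 64
loss = pcl_loss(torch.randn(B, d), torch.randn(B, d), torch.randn(B, K, d))
loss.backward
```

In this sketch the BCE term supplies the absolute constraint the abstract says plain CL lacks, while the uniformity term spreads all batch items over the hypersphere instead of only separating positives from negatives.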
Keywords
personalization, contrastive loss, sequential recommendation, uniformity