A time-aware self-attention based neural network model for sequential recommendation

Applied Soft Computing (2022)

Cited by 9
Abstract
Sequential recommendation has been a hot research topic in recent years. Various sequential recommendation models have been proposed, of which Self-Attention (SA)-based models show state-of-the-art performance. However, most existing SA-based sequential recommendation models do not make use of temporal information, i.e., the timestamps of user–item interactions, apart from an initial attempt (Li et al., 2020). In this paper, we propose a Time-Aware Transformer for Sequential Recommendation (TAT4SRec), an SA-based neural network model that exploits temporal information to capture users' preferences more precisely. TAT4SRec has two salient features: (1) TAT4SRec uses an encoder–decoder structure to model timestamps and interacted items separately, which appears to be a better way of exploiting temporal information. (2) In the proposed TAT4SRec, two different embedding modules transform continuous data (timestamps) and discrete data (item IDs) into embedding matrices, respectively. Specifically, we propose a window-function-based embedding module that preserves the continuous dependency between similar timestamps. Finally, extensive experiments demonstrate the effectiveness of the proposed TAT4SRec over various state-of-the-art MC/RNN/SA-based sequential recommendation models under several widely used metrics. Further experiments show the rationality of the proposed structures and demonstrate the computational efficiency of TAT4SRec. These promising results make it feasible to apply TAT4SRec in various online applications.
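The abstract does not give the exact form of the window-function embedding, only that it should map similar timestamps to similar embeddings (unlike hard bucketing of item IDs). Below is a minimal, hypothetical sketch of one such module: each timestamp is softly assigned to a set of time "buckets" via Gaussian windows over bucket centers, and the embedding is the weighted mix of per-bucket embedding rows. The function name, the Gaussian window choice, and all parameters are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def window_embedding(timestamps, centers, width, table):
    """Soft-bucket continuous timestamps into embeddings.

    Hypothetical sketch: a Gaussian window over bucket `centers`
    gives each timestamp a soft assignment, so nearby timestamps
    receive nearly identical embeddings (continuous dependency),
    unlike a hard one-hot lookup used for discrete item IDs.
    """
    # (n, k) distances from each timestamp to each bucket center
    d = timestamps[:, None] - centers[None, :]
    w = np.exp(-0.5 * (d / width) ** 2)          # Gaussian window weights
    w /= w.sum(axis=1, keepdims=True)            # normalize per timestamp
    return w @ table                             # (n, dim) mixed embeddings

rng = np.random.default_rng(0)
centers = np.linspace(0.0, 100.0, 11)            # learned in a real model
table = rng.normal(size=(11, 8))                 # one embedding row per bucket
ts = np.array([10.0, 10.5, 90.0])
emb = window_embedding(ts, centers, width=5.0, table=table)
```

With this construction, the embeddings for 10.0 and 10.5 are nearly identical while 90.0 is far away, which is the "continuous dependency" property the abstract attributes to the module; discrete item IDs, by contrast, would go through an ordinary lookup-table embedding.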
Keywords
Neural recommender systems, Sequential recommendation, Self-attention, Transformer