Uncovering Selective State Space Model's Capabilities in Lifelong Sequential Recommendation
arXiv (2024)
Abstract
Sequential Recommenders have been widely applied in various online services,
aiming to model users' dynamic interests from their sequential interactions.
With users increasingly engaging with online platforms, vast amounts of
lifelong user behavioral sequences have been generated. However, existing
sequential recommender models often struggle to handle such lifelong sequences.
The primary challenges stem from computational complexity and the ability to
capture long-range dependencies within the sequence. Recently, a state space
model featuring a selective mechanism (i.e., Mamba) has emerged. In this work,
we investigate the performance of Mamba for lifelong sequential recommendation
(i.e., length >= 2k). More specifically, we leverage the Mamba block to model
lifelong user sequences selectively. We conduct extensive experiments to
evaluate the performance of representative sequential recommendation models in
the setting of lifelong sequences. Experiments on two real-world datasets
demonstrate the superiority of Mamba. We find that RecMamba achieves
performance comparable to the representative model while significantly reducing
training duration by approximately 70%. Codes are available at .
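
To make the abstract's core idea concrete, below is a minimal sketch of how a Mamba block can serve as the sequence encoder in a sequential recommender. It assumes the open-source `mamba-ssm` package; the model name `MambaRec`, the layer count, the residual wiring, and all hyperparameters are illustrative assumptions for this sketch, not the authors' exact RecMamba configuration.

```python
# A minimal sketch of a Mamba-based sequential recommender in PyTorch.
# Assumes the `mamba-ssm` package; hyperparameters are illustrative only.
import torch
import torch.nn as nn
from mamba_ssm import Mamba


class MambaRec(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        # Item ID 0 is reserved for padding.
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        # Stack of selective state space blocks; linear in sequence length,
        # which is what makes lifelong sequences (length >= 2k) tractable.
        self.layers = nn.ModuleList(
            Mamba(d_model=d_model, d_state=16, d_conv=4, expand=2)
            for _ in range(n_layers)
        )
        self.norm = nn.LayerNorm(d_model)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, length) of item IDs; length may be in the thousands.
        h = self.item_emb(seq)        # (batch, length, d_model)
        for layer in self.layers:
            h = h + layer(h)          # residual connection around each block
        h = self.norm(h[:, -1, :])    # last hidden state summarizes the user
        # Score every candidate item against the sequence representation.
        return h @ self.item_emb.weight.T  # (batch, num_items + 1)
```

For next-item recommendation, the top-scoring item IDs (after masking padding and already-seen items) would be returned; training typically minimizes cross-entropy between these scores and the ground-truth next item.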