Incremental Learning For Matrix Factorization In Recommender Systems

2016 IEEE International Conference on Big Data (Big Data)

Abstract
Recommender systems play a key role in personalizing service experiences by recommending relevant items to users. One popular technique for producing such personalization at scale is collaborative filtering via Matrix Factorization (MF). The essence of MF is to train a model by factorizing a sparse rating matrix consisting of users' ratings of items. Unfortunately, existing MF methods require learning the model from scratch when new data (for users, items, or user ratings) arrive. Learning large models from scratch incurs significant computation cost and typically also results in stale recommendations. With increasing amounts of data and a need for real-time recommendations, incremental learning is desirable. In this paper, we develop a novel but simple method for incremental learning of MF models, called One-sided Least Squares, and demonstrate its parallel implementation via Apache Spark. We also describe how to integrate it with batch learning via Alternating Least Squares (ALS). Unlike previous incremental learning methods, ours closely approximates the results of ALS while significantly reducing compute and storage costs. Our theoretical analysis and experimental results on three real-world datasets suggest that One-sided Least Squares achieves prediction accuracy close to learning from scratch with ALS at substantially faster learning speeds. This fast and accurate method for incremental learning enables improved Web-scale recommender systems.
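As a rough illustration of the "one-sided" idea (a minimal sketch, not the paper's code; the function name and regularization parameter are assumptions): with the item-factor matrix V held fixed, a new or updated user's latent vector can be obtained by solving a small regularized least-squares problem over only that user's observed ratings, instead of re-running full ALS over both factor matrices.

```python
import numpy as np

def one_sided_update(V, rated_items, ratings, reg=0.1):
    """Solve the normal equations (V_S^T V_S + reg*I) u = V_S^T r
    for one user's latent vector u, with item factors V held fixed.
    This is the one-sided step: only the user side is (re)learned."""
    V_S = V[rated_items]                   # factors of the items this user rated
    k = V.shape[1]                         # latent dimension
    A = V_S.T @ V_S + reg * np.eye(k)      # regularized Gram matrix
    b = V_S.T @ ratings
    return np.linalg.solve(A, b)

# Toy usage: 100 items with rank-8 factors, one user with 3 new ratings.
rng = np.random.default_rng(0)
V = rng.normal(size=(100, 8))
rated = np.array([3, 17, 42])
r = np.array([4.0, 5.0, 3.0])
u = one_sided_update(V, rated, r)
preds = V[rated] @ u                       # reconstructed ratings for the user
```

Because each user's update touches only that user's ratings and the fixed V, these solves are independent across users, which is what makes a parallel (e.g., Spark) implementation straightforward.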
Keywords
Incremental Learning, Least Squares, Matrix Factorization, Recommender Systems, Big Data, Spark