KDCRec: Knowledge Distillation for Counterfactual Recommendation via Uniform Data

IEEE Transactions on Knowledge and Data Engineering (2023)

Abstract
Bias is an important challenge in recommender systems. In this paper, we focus on mitigating bias via uniform data. Previous works have shown that even simple modeling with uniform data can alleviate bias and improve performance. However, uniform data are usually scarce and expensive to collect in a real product. To use this valuable uniform data more effectively, we propose a novel and general knowledge distillation framework for counterfactual recommendation with four specific methods: label-based distillation, feature-based distillation, sample-based distillation, and model structure-based distillation. Moreover, we discuss the relation between the proposed framework and previous works. We then conduct extensive experiments on both public and product datasets to verify the effectiveness of the four proposed methods. In addition, we analyze how the proposed methods perform under several key factors, as well as the resulting changes in the distribution of the recommendation lists. Finally, we emphasize that counterfactual modeling with uniform data is a rich research area, and list some interesting and promising topics worthy of further exploration. The source code is available at https://github.com/dgliu/TKDE_KDCRec .
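The precise objectives of the four methods are defined in the paper itself; as a rough illustration of the label-based idea alone (a teacher trained on the small uniform dataset supplies soft labels that regularize a student trained on the large biased dataset), the following is a minimal sketch with hypothetical function names and a hypothetical blending weight `alpha`, not the authors' actual implementation:

```python
import numpy as np

def bce(target, pred, eps=1e-7):
    """Binary cross-entropy averaged over samples."""
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def label_distillation_loss(y_true, student_pred, teacher_pred, alpha=0.5):
    """Blend the hard-label loss on biased feedback with a soft-label
    loss that pulls the student toward a teacher trained on uniform data.

    alpha=1.0 recovers plain training on the biased labels;
    alpha=0.0 trains purely on the teacher's soft labels.
    """
    hard = bce(y_true, student_pred)       # fit the observed (biased) clicks
    soft = bce(teacher_pred, student_pred)  # imitate the uniform-data teacher
    return alpha * hard + (1 - alpha) * soft
```

In practice the student's predictions would come from any differentiable recommender, and `alpha` would be tuned on a uniform validation split; the sketch only shows how the two supervision signals are combined.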
Keywords
Counterfactual, bias, recommender systems, knowledge distillation, uniform data