Intersectional Two-sided Fairness in Recommendation
CoRR (2024)
Abstract
Fairness of recommender systems (RS) has attracted increasing attention
recently. Based on the involved stakeholders, the fairness of RS can be divided
into user fairness, item fairness, and two-sided fairness which considers both
user and item fairness simultaneously. However, we argue that intersectional
two-sided unfairness may persist even when an RS is two-sided fair, a problem
that we observe in empirical studies on real-world data in this paper and that
has not been well studied previously. To mitigate this problem,
we propose a novel approach called Intersectional Two-sided Fairness
Recommendation (ITFR). Our method utilizes a sharpness-aware loss to perceive
disadvantaged groups, and then uses collaborative loss balance to develop
consistent distinguishing abilities for different intersectional groups.
Additionally, predicted score normalization is leveraged to align positive
predicted scores to fairly treat positives in different intersectional groups.
Extensive experiments and analyses on three public datasets show that our
proposed approach effectively alleviates the intersectional two-sided
unfairness and consistently outperforms previous state-of-the-art methods.
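The abstract mentions that predicted score normalization aligns positive predicted scores across intersectional groups. As a rough illustration only (not the paper's actual formulation), the sketch below shifts each group's positive scores so that all intersectional groups share a common mean, removing systematic score gaps between groups; the function name and the choice of the global mean as the anchor are assumptions for this example.

```python
import numpy as np

def normalize_positive_scores(scores, group_ids):
    """Hypothetical sketch: shift each intersectional group's positive
    scores so every group's positives share the same (global) mean.
    This is an illustration of the idea of per-group score alignment,
    not the ITFR paper's exact method."""
    scores = np.asarray(scores, dtype=float)
    group_ids = np.asarray(group_ids)
    target = scores.mean()  # common anchor: the global mean score
    out = scores.copy()
    for g in np.unique(group_ids):
        mask = group_ids == g
        # Center this group's scores, then move them to the anchor.
        out[mask] = scores[mask] - scores[mask].mean() + target
    return out

# Toy example: group 1's positives are systematically scored lower.
scores = [0.9, 0.8, 0.3, 0.2]
groups = [0, 0, 1, 1]
aligned = normalize_positive_scores(scores, groups)
```

After this shift, both groups' positives have the same mean score, so a ranking cutoff no longer disadvantages one intersectional group's positives as a whole; relative order within each group is preserved.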