Learning Mixtures of Random Utility Models with Features from Incomplete Preferences

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (2022)

Abstract
Random Utility Models (RUMs), which subsume the Plackett-Luce model (PL) as a special case, are among the most popular models for preference learning. In this paper, we consider RUMs with features and their mixtures, where each alternative has a vector of features, possibly different across agents. Such models significantly generalize the standard PL and RUMs but are not as well investigated in the literature. We extend mixtures of RUMs with features to models that generate incomplete preferences and characterize their identifiability. For PL, we prove that when PL with features is identifiable, its MLE is consistent with a strictly concave objective function under mild assumptions, by characterizing a bound on the root-mean-square error (RMSE), which naturally leads to a sample complexity bound. We also characterize the identifiability of more general RUMs with features and propose a generalized RBCML to learn them. Our experiments on synthetic data demonstrate the effectiveness of MLE on PL with features, with tradeoffs between statistical efficiency and computational efficiency. Our experiments on real-world data show the predictive power of PL with features and its mixtures.
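The abstract refers to the MLE of PL with features, whose negative log-likelihood is concave in the parameter. As a minimal sketch (not the paper's implementation), the log-likelihood of one full ranking under a shared linear-utility PL model could look as follows; the function name and the assumption of a single parameter vector shared across agents are illustrative choices:

```python
import numpy as np

def pl_log_likelihood(theta, X, ranking):
    """Log-likelihood of one full ranking under Plackett-Luce with features.

    theta   : (d,) parameter vector (assumed shared across agents here)
    X       : (m, d) feature matrix, one row per alternative
    ranking : list of alternative indices, most preferred first
    """
    utils = X @ theta                  # linear utilities theta . x_i
    ll = 0.0
    remaining = list(ranking)
    for _ in range(len(ranking) - 1):  # last remaining choice is deterministic
        top = remaining[0]
        scores = utils[remaining]
        # log P(top chosen among remaining), via log-sum-exp for stability
        m = scores.max()
        ll += utils[top] - (m + np.log(np.exp(scores - m).sum()))
        remaining.pop(0)
    return ll
```

For example, with `theta = 0` all alternatives have equal utility, so a ranking of three alternatives has probability 1/6 and log-likelihood `-log 6`. Summing this quantity over observed (possibly incomplete, top-k style) rankings and maximizing over `theta` gives the MLE discussed in the paper.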
Keywords
random utility models,mixtures,incomplete,features