On Loss Functions and Regret Bounds for Multi-Category Classification

IEEE Transactions on Information Theory (2022)

Abstract
We develop new approaches in multi-class settings for constructing loss functions and establishing corresponding regret bounds with respect to the zero-one or cost-weighted classification loss. We provide new general representations of losses by deriving inverse mappings from a concave generalized entropy to a loss through a convex dissimilarity function related to the multi-distribution $f$-divergence. This approach is then applied to study both hinge-like losses and proper scoring rules. In the first case, we derive new hinge-like convex losses, which are tighter extensions outside the probability simplex than related hinge-like losses and are geometrically simpler, with fewer non-differentiable edges. We also establish a classification regret bound in general for all losses with the same generalized entropy as the zero-one loss, thereby substantially extending and improving existing results. In the second case, we identify new sets of multi-class proper scoring rules through different types of dissimilarity functions and reveal interesting relationships between various composite losses currently in use. We also establish new classification regret bounds in general for multi-class proper scoring rules and, as applications, provide simple, meaningful regret bounds for two specific sets of proper scoring rules. These results generalize, for the first time, previous two-class regret bounds to multi-class settings.
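As orientation for the abstract's terminology, the following is a standard background sketch (the classical Savage representation for proper losses, not the paper's new representation): for a multi-class proper loss $\ell(y, q)$ evaluated at a class-probability estimate $q$ in the probability simplex, the generalized entropy is the concave Bayes risk $H(p) = \sum_y p_y \, \ell(y, p)$, and the pointwise regret of predicting $q$ when the true class-probability vector is $p$ is a Bregman divergence:
$$
\ell(y, q) \;=\; H(q) + \big\langle \nabla H(q),\, e_y - q \big\rangle,
\qquad
\sum_y p_y\, \ell(y, q) \;-\; H(p) \;=\; D_{-H}(p, q),
$$
where $e_y$ is the one-hot vector for class $y$ and $D_{-H}$ is the Bregman divergence generated by the convex function $-H$. For instance, the log loss $\ell(y, q) = -\log q_y$ has $H(p) = -\sum_y p_y \log p_y$ (Shannon entropy) and regret $\mathrm{KL}(p \,\|\, q)$; the Brier score $\ell(y, q) = \| e_y - q \|_2^2$ has $H(p) = 1 - \| p \|_2^2$ and regret $\| p - q \|_2^2$; and the zero-one loss has $H(p) = 1 - \max_y p_y$, the generalized entropy referred to in the abstract.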
Keywords
Boosting, Bregman divergence, composite loss, exponential loss, f-divergence, generalized entropy, hinge loss, proper scoring rule, surrogate regret bounds