Fast greedy 𝒞-bound minimization with guarantees

Machine Learning (2020)

Abstract
The 𝒞-bound is a tight bound on the true risk of a majority vote classifier that relies on the individual quality and pairwise disagreement of the voters and provides PAC-Bayesian generalization guarantees. Based on this bound, MinCq is a classification algorithm that minimizes it to return a dense distribution on a finite set of voters. Introduced later and inspired by boosting, CqBoost uses a column generation approach to build a sparse 𝒞-bound-optimal distribution on a possibly infinite set of voters. However, both approaches have a high computational learning time because they minimize the 𝒞-bound by solving a quadratic program. Still, one advantage of CqBoost is its experimental ability to provide sparse solutions. In this work, we address the problem of accelerating the 𝒞-bound minimization process while keeping the sparsity of the solution and without losing accuracy. We present CB-Boost, a computationally efficient classification algorithm relying on a greedy, boosting-based 𝒞-bound optimization. An in-depth analysis proves the optimality of the greedy minimization process and quantifies the decrease of the 𝒞-bound achieved by the algorithm. Generalization guarantees are then drawn from existing PAC-Bayesian theorems. In addition, we experimentally evaluate the relevance of CB-Boost in terms of the three main properties we expect of it: accuracy, sparsity, and computational efficiency compared to MinCq, CqBoost, Adaboost, and other ensemble methods. As observed in these experiments, CB-Boost not only achieves results comparable to the state of the art, but also provides 𝒞-bound sub-optimal weights at very low computational cost while keeping the sparsity property of CqBoost.
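To make the quantity being minimized concrete, here is a minimal sketch of the empirical 𝒞-bound as it is usually stated in the PAC-Bayes literature: for a distribution Q over voters, 𝒞(Q) = 1 − (E[M_Q])² / E[M_Q²], where M_Q(x, y) is the Q-weighted margin of the majority vote on example (x, y). The function name and array conventions below are illustrative, not from the paper.

```python
import numpy as np

def empirical_c_bound(votes, labels, weights):
    """Empirical C-bound of a Q-weighted majority vote.

    votes:   (n_samples, n_voters) array of +/-1 voter outputs
    labels:  (n_samples,) array of +/-1 true labels
    weights: (n_voters,) distribution Q over voters (non-negative, sums to 1)
    """
    # Per-example margin M_Q(x, y) = y * sum_j Q(j) * h_j(x)
    margins = labels * (votes @ weights)
    m1 = margins.mean()          # first moment of the margin
    m2 = (margins ** 2).mean()   # second moment of the margin
    # The bound is only meaningful when the first margin moment is positive
    assert m1 > 0, "C-bound requires a positive first margin moment"
    return 1.0 - (m1 ** 2) / m2
```

A greedy minimizer in the spirit of CB-Boost would repeatedly pick the voter whose addition most decreases this quantity, rather than solving a quadratic program over all voters at once.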
Keywords
PAC-Bayes, Boosting, Ensemble methods, 𝒞-bound, Greedy optimization