A Principled Two-Step Method for Example-Dependent Cost Binary Classification.

Lecture Notes in Computer Science (2019)

Abstract
This paper presents a principled two-step method for example-dependent cost binary classification problems. The first step obtains a consistent estimate of the posterior probabilities by training a Multi-Layer Perceptron with a Bregman surrogate cost. The second step uses these estimates in a Bayesian decision rule. When working with imbalanced datasets, neutral re-balancing yields better estimates of the posterior probabilities. Experiments on real datasets show the good performance of the proposed method in comparison with other procedures.
Keywords
Bregman divergences, Classification, Example-dependent cost, Imbalanced data, Neural networks
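
The two steps described in the abstract can be illustrated with a minimal sketch, not the authors' implementation: step 1 trains an MLP with cross-entropy (a Bregman-type surrogate cost) to estimate posterior probabilities, and step 2 applies an example-dependent-cost Bayes decision rule. The cost arrays c_fp / c_fn, the network size, and the assumption of zero cost for correct decisions are illustrative choices, not taken from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def two_step_classify(X_train, y_train, X_test, c_fp, c_fn):
    # Step 1: estimate posterior probabilities with an MLP trained on
    # log-loss (cross-entropy), which is a Bregman surrogate cost.
    mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
    mlp.fit(X_train, y_train)
    p = mlp.predict_proba(X_test)[:, 1]  # estimated P(y = 1 | x)

    # Step 2: example-dependent Bayes decision rule. Assuming zero cost
    # for correct decisions, predict the positive class when the expected
    # cost of a false negative exceeds that of a false positive:
    #   p * c_fn > (1 - p) * c_fp
    return (p * c_fn > (1.0 - p) * c_fp).astype(int)

# Usage on synthetic data (shapes and cost ranges are assumptions).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)
c_fp = rng.uniform(1, 5, size=100)   # per-example false-positive costs
c_fn = rng.uniform(1, 10, size=100)  # per-example false-negative costs
y_hat = two_step_classify(X[:100], y[:100], X[100:], c_fp, c_fn)
```

Because the decision threshold depends on the per-example costs, the same posterior estimate can lead to different decisions for different examples, which is the motivation for separating probability estimation from the final cost-sensitive decision.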