One-step Bayesian example-dependent cost classification: The OsC-MLP method

SSRN Electronic Journal (2024)

Abstract
Example-dependent cost classification problems are those in which the decision costs depend not only on the true and the assigned classes but also on the sample features. Discriminative algorithms that carry out such classification tasks must take this dependence into account. In some applications, the decision costs are known for the training set but not in production, which complicates the problem. In this paper, we introduce a new one-step Bayesian formulation for training neural networks that solves this limitation for binary cases with one-step learning machines, avoiding the drawbacks created by unknown analytical forms of the example-dependent costs. The formulation defines an artificial likelihood ratio from the available training classification costs and proposes a test that does not require cost values for unseen samples. It also includes Bayesian rebalancing mechanisms to combat the negative effects of class imbalance. Experimental results support the consistency and effectiveness of the corresponding algorithms.
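The abstract does not spell out the training objective, so the following is only a minimal illustrative sketch of the general idea it describes: a neural network trained in one step with per-example misclassification costs that are needed only at training time, while the test decision is a cost-free threshold on the network output. The class MLP, the helper cost_weighted_bce, and the cost names c_fp and c_fn are assumptions introduced here for illustration; they are not the paper's OsC-MLP formulation or notation, and the Bayesian rebalancing mechanisms are not shown.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Small binary classifier; outputs a single logit per sample."""
    def __init__(self, n_features: int, n_hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def cost_weighted_bce(logits, y, c_fp, c_fn):
    """Binary cross-entropy where each training example is weighted by the
    cost of misclassifying it: c_fn(x) for positives, c_fp(x) for negatives.
    These example-dependent costs are assumed known only for training data."""
    weights = torch.where(y == 1, c_fn, c_fp)
    losses = nn.functional.binary_cross_entropy_with_logits(
        logits, y.float(), reduction="none"
    )
    return (weights * losses).mean()

def train(model, x, y, c_fp, c_fn, epochs=100, lr=1e-3):
    """Single (one-step) training pass over the cost-weighted objective."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = cost_weighted_bce(model(x), y, c_fp, c_fn)
        loss.backward()
        opt.step()
    return model

@torch.no_grad()
def predict(model, x, threshold=0.5):
    """Test-time rule: threshold the estimated posterior. No example-dependent
    costs are required for unseen samples."""
    return (torch.sigmoid(model(x)) >= threshold).long()
```

In this sketch the costs shape the decision boundary implicitly through the training loss, which is why the production-time decision can be a plain threshold; how the paper's artificial likelihood ratio and Bayesian rebalancing refine this is detailed in the full text.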
Keywords
Imbalance, Bregman divergences, Neural networks, Informed re-balancing, Sample emphasis