Forming Ensembles of Soft One-Class Classifiers with Weighted Bagging

New Generation Computing (2015)

Abstract
For many real-life problems, obtaining representative examples from a given class is relatively easy, while obtaining examples from the remaining classes is difficult, or even impossible. Nevertheless, we would still like to construct a pattern classifier that can distinguish between the known and unknown cases. Here we are dealing with one-class classification, or learning in the absence of counterexamples. Such recognition systems must display high robustness to new, unseen objects that may belong to an unknown class. That is why ensemble learning has become an attractive perspective in this field. In our work, we propose a novel one-class ensemble classifier based on weighted Bagging. The Wagging method is used to obtain randomized weights, which are used directly in the training of Weighted One-Class Support Vector Machines. This introduces diversity into the pool of one-class classifiers and extends the competence of the formed ensemble. Additionally, to discard similar or weak classifiers, we propose adding a clustering-based pruning procedure to our ensemble. It works by measuring the similarity between the weights used by each base model and detecting groups of similar predictors. This allows us to reduce the number of classifiers in the pool by selecting a single representative for each cluster. Experimental analysis, carried out on a number of benchmarks and backed up with statistical analysis, proves that the proposed method can outperform state-of-the-art ensembles dedicated to one-class classification.
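The pipeline described above (randomized Wagging weights, Weighted One-Class SVMs, and clustering-based pruning) can be illustrated with a minimal sketch. This is not the authors' implementation: the exponential distribution for the Wagging weights, the use of scikit-learn's OneClassSVM with sample_weight as a stand-in for the Weighted One-Class SVM, and the k-means pruning with an arbitrary cluster count are all assumptions made for illustration.

```python
# A minimal sketch of the described ensemble, under the assumptions stated above.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))  # training data from the target class only

n_members, n_clusters = 15, 5  # illustrative choices, not values from the paper
weights, models = [], []
for _ in range(n_members):
    w = rng.exponential(scale=1.0, size=len(X))        # randomized Wagging weights
    m = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
    m.fit(X, sample_weight=w)                          # weights used directly in training
    weights.append(w)
    models.append(m)

# Clustering-based pruning: group base models by the similarity of their
# weight vectors and keep a single representative per cluster.
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(np.vstack(weights))
pruned = [models[np.flatnonzero(labels == c)[0]] for c in range(n_clusters)]

# Soft ensemble decision: average the continuous outputs of the retained members.
def ensemble_score(X_new):
    return np.mean([m.decision_function(X_new) for m in pruned], axis=0)

print(ensemble_score(rng.normal(size=(3, 5))))  # higher scores indicate the target class
```

Averaging the members' decision_function values keeps the combined output soft, so a rejection threshold for unknown objects can still be tuned on the ensemble level.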
Keywords
Classifier Ensemble, One-Class Classification, Bagging, Wagging, Ensemble Pruning, Soft Classifier