MOOforest - Multi-objective Optimization to Form Decision Tree Ensemble.

PCC (2)(2023)

Abstract
Multi-criteria optimization is increasingly used to build classifier ensembles, including for the imbalanced data classification task. In that setting, at least two criteria related to the prediction quality of the minority and majority classes, i.e., the per-class classification precision, must be optimized simultaneously. The paper proposes MOOforest, a new method for building decision tree ensembles. It uses the MOEA/D optimization algorithm to return a diverse pool of base classifiers by selecting different feature subsets on which they are trained. From the pool of non-dominated solutions, the final ensemble is chosen using the PROMETHEE method. Modifying the weights of the PROMETHEE algorithm allows the user to select the appropriate solution in the context of the user's expectations (i.e., the weights indicate how important each optimization criterion is to the user). It is worth noting that during classifier ensemble training, the features for the base classifiers are selected by the optimization process, rather than at random as in popular algorithms employing the Random Subspace approach (such as Random Forest). The proposed method's advantage over that approach is confirmed through a comprehensive set of computer experiments.
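The final-ensemble selection step described above can be sketched as a PROMETHEE II ranking over the non-dominated solutions: the user supplies a weight per optimization criterion, and the candidate with the highest net outranking flow is chosen. The "usual" preference function and the example scores below are illustrative assumptions, not details taken from the paper.

```python
def promethee_ii(scores, weights):
    """Return the index of the best alternative by PROMETHEE II net flow.

    scores  : list of per-alternative criterion vectors (all maximized)
    weights : user-supplied criterion weights, summing to 1
    """
    n = len(scores)

    def pref(a, b):
        # "Usual" preference function: full preference (1) if strictly
        # better on a criterion, else 0; aggregated with the user weights.
        return sum(w * (1.0 if sa > sb else 0.0)
                   for w, sa, sb in zip(weights, a, b))

    net = []
    for i in range(n):
        phi_plus = sum(pref(scores[i], scores[j]) for j in range(n) if j != i)
        phi_minus = sum(pref(scores[j], scores[i]) for j in range(n) if j != i)
        net.append((phi_plus - phi_minus) / (n - 1))
    return max(range(n), key=net.__getitem__)

# Hypothetical Pareto front: (minority-class quality, majority-class quality)
# for each candidate ensemble returned by the optimizer.
front = [(0.62, 0.95), (0.71, 0.90), (0.80, 0.78)]

# Weights favouring the minority class steer the choice toward candidate 2.
best = promethee_ii(front, weights=(0.7, 0.3))  # -> 2
```

Shifting the weights toward the majority-class criterion, e.g. `weights=(0.3, 0.7)`, selects candidate 0 instead, which is how the user's priorities steer the final ensemble choice.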
Keywords
ensemble, optimization, decision, multi-objective