Random feature weights for regression trees

Progress in Artificial Intelligence (2016)

Abstract
Ensembles are learning methods whose operation relies on a combination of different base models. The diversity among those base models is a fundamental aspect that conditions how an ensemble performs. Random Feature Weights (ℛℱ𝒲) was proposed as a classification-tree ensemble construction method in which diversity is introduced into each tree by means of a random weight associated with each attribute. These weights vary from one tree to another in the ensemble. In this article, the idea of ℛℱ𝒲 is adapted to decision-tree regression. A comparison is drawn with other ensemble construction methods, namely Bagging, Random Forest, Iterated Bagging, Random Subspaces and AdaBoost.R2, with the adapted method obtaining competitive results.
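The core idea described in the abstract is that every tree in the ensemble draws its own random weight per attribute, and that weight rescales the attribute's merit when split points are chosen. Below is a minimal sketch of that idea for regression, not the authors' reference implementation: the variance-reduction split criterion, the weight distribution u**(1/p), and all class and parameter names are assumptions made purely for illustration.

```python
import numpy as np


class RFWRegressionTree:
    """Regression tree whose split merit is scaled by per-feature random weights (illustrative sketch)."""

    def __init__(self, feature_weights, max_depth=5, min_samples_split=5):
        self.w = feature_weights          # one random weight per attribute, fixed for this tree
        self.max_depth = max_depth
        self.min_samples_split = min_samples_split

    def fit(self, X, y):
        self.tree_ = self._grow(X, y, depth=0)
        return self

    def _grow(self, X, y, depth):
        if depth >= self.max_depth or len(y) < self.min_samples_split or np.var(y) == 0.0:
            return ("leaf", float(y.mean()))
        parent_sse = np.var(y) * len(y)   # total sum of squared errors before splitting
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j])[:-1]:
                left = X[:, j] <= t
                sse = np.var(y[left]) * left.sum() + np.var(y[~left]) * (~left).sum()
                # RFW idea: the merit of feature j is multiplied by its random weight,
                # so different trees are biased toward different attributes.
                gain = self.w[j] * (parent_sse - sse)
                if best is None or gain > best[0]:
                    best = (gain, j, t)
        if best is None or best[0] <= 0.0:
            return ("leaf", float(y.mean()))
        _, j, t = best
        left = X[:, j] <= t
        return ("split", j, t,
                self._grow(X[left], y[left], depth + 1),
                self._grow(X[~left], y[~left], depth + 1))

    def predict(self, X):
        return np.array([self._predict_one(x, self.tree_) for x in X])

    def _predict_one(self, x, node):
        while node[0] == "split":
            _, j, t, left, right = node
            node = left if x[j] <= t else right
        return node[1]


class RFWRegressionEnsemble:
    """Ensemble of trees, each grown on the full data but with its own random feature weights."""

    def __init__(self, n_trees=25, p=2.0, max_depth=5, random_state=None):
        self.n_trees = n_trees
        self.p = p                        # assumed exponent shaping the weight distribution
        self.max_depth = max_depth
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        self.trees_ = []
        for _ in range(self.n_trees):
            # draw a fresh weight vector for every tree; this is where the diversity comes from
            w = self.rng.uniform(size=X.shape[1]) ** (1.0 / self.p)
            self.trees_.append(RFWRegressionTree(w, max_depth=self.max_depth).fit(X, y))
        return self

    def predict(self, X):
        # average the per-tree predictions, as in Bagging or Random Forest
        return np.mean([tree.predict(X) for tree in self.trees_], axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 4))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)
    model = RFWRegressionEnsemble(n_trees=25, random_state=0).fit(X, y)
    print(model.predict(X[:5]))
```

In this sketch the averaging step plays the same role as in Bagging or Random Forest; what differs is the source of diversity, which here comes from the per-tree attribute weights rather than from bootstrap resampling or feature subsampling.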
Keywords
Regression trees, Ensembles, Bagging, Decision trees, Random feature weights