Hyper-parameter Tuning using Genetic Algorithms for Software Effort Estimation

Proceedings of the 2021 16th Iberian Conference on Information Systems and Technologies (CISTI'2021), 2021

Abstract
Research on software effort estimation has investigated the impact of hyper-parameter tuning approaches for machine learning. Many automated tuning approaches have been proposed in the literature to improve the performance of their base models. In this study we compare the Dodge, Grid, Harmony, Tabu, and Random Search algorithms against Standard Genetic, 1+1 Genetic, and Compact Genetic Algorithms in terms of their impact on model accuracy and stability. The evaluation was performed on the ISBSG R18 dataset using the Support Vector Regression, Classification and Regression Trees, and Ridge Regression techniques. Results of the Scott-Knott analysis show that the genetic algorithms perform similarly to or better than the other tuners, achieving an improvement in standardized accuracy of up to 0.21, with final values of up to 0.53. Genetic algorithm-tuned Support Vector Regression achieves the highest accuracy while improving estimation stability relative to other tuners.
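To illustrate the flavor of one of the tuners compared above, the sketch below implements a generic (1+1) genetic algorithm: a single parent candidate is mutated, and the child replaces the parent only when it scores at least as well. This is a minimal, hedged illustration, not the paper's implementation; the objective `toy_fitness` and the SVR-like hyper-parameters (C, epsilon) are hypothetical stand-ins for the validation-accuracy objective the study evaluates on ISBSG R18.

```python
import random

def one_plus_one_ga(fitness, init, mutate, generations=200, seed=0):
    """Minimal (1+1) genetic algorithm: keep one parent, mutate it,
    and accept the child only when it scores at least as well."""
    rng = random.Random(seed)
    parent = init
    best = fitness(parent)
    for _ in range(generations):
        child = mutate(parent, rng)
        score = fitness(child)
        if score >= best:
            parent, best = child, score
    return parent, best

# Hypothetical stand-in for a validation-accuracy objective over two
# SVR-like hyper-parameters (C, epsilon); higher is better, peaking
# at C = 10 and epsilon = 0.1.
def toy_fitness(params):
    c, eps = params
    return -((c - 10.0) ** 2 + (eps - 0.1) ** 2)

def mutate(params, rng):
    # Gaussian perturbation of each hyper-parameter; epsilon is
    # clamped to stay non-negative.
    c, eps = params
    return (c + rng.gauss(0, 1.0), max(0.0, eps + rng.gauss(0, 0.05)))

best_params, best_score = one_plus_one_ga(toy_fitness, (1.0, 1.0), mutate)
```

Because the acceptance rule is monotone, the best score never degrades across generations, which is the stability property that makes the (1+1) scheme a cheap baseline among the GA variants.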
Keywords
optimization,hyper-parameters,evolutive algorithm,machine learning,empirical study