On the Impact of Tool Evolution and Case Study Size on SBSE Experiments: A Replicated Study with EvoMaster

Search-Based Software Engineering, SSBSE 2023 (2024)

Abstract
In the dynamic landscape of Search-Based Software Engineering (SBSE), tools and algorithms are continually improved, possibly making past experimental insights outdated. This can happen when a newly designed technique has side effects on the techniques and parameter settings studied in previous work. Re-tuning all possible parameters of an SBSE tool for each new scientific study would not be viable, as it would be too expensive and time consuming, considering there could be hundreds of them. In this paper, we carried out a series of experiments to study the impact that such re-tuning could have. For this study, we chose the SBSE tool EvoMaster, an open-source tool for automated test generation for REST APIs. It has been actively developed for over six years, since November 2016, making it an appropriate choice for this kind of study. In these experiments, we replicated four previous studies of EvoMaster, using its latest version, with 15 REST APIs as case studies. Our findings reveal that updated parameter settings can offer improved performance, underscoring the possible benefits of re-tuning already existing parameters. Additionally, the inclusion of a broader range of case studies supports the outcomes of the replicated studies compared to the originals, enhancing their external validity.
Keywords
White-Box Test Generation, SBST, RESTful APIs, Parameter Tuning, Replicating Studies