A conjugated evolutionary algorithm for hyperparameter optimization

2022 IEEE Congress on Evolutionary Computation (CEC)(2022)

Abstract
With the recent upsurge in the use of deep learning and other computationally expensive machine learning models, hyperparameter optimization has become an important and widely researched area of study. Genetic algorithms, a subclass of evolutionary algorithms, have proven to be an effective approach and have been widely used in recent years. However, efficiently exploring the domain of possible solutions remains a challenging and often computationally expensive task. In this paper, we present a novel and efficient hyperparameter optimization strategy based on a genetic algorithm variant: Biased Random-Key Genetic Algorithms (BRKGA). One of the main challenges of BRKGA is its limited capacity to explore the domain surrounding a particular individual. Although good genes are preserved by its bias property, these genes are copied as they are, so even if a better solution exists in the close neighborhood of a particular gene, it might never be explored. We tackle this problem by adding an exploitation component at the end of every evolutionary step, further exploring the hyperparameter domain. Several computational experiments on eight publicly available datasets were performed to assess the effectiveness of the proposed approach and to show that it is a significant improvement over its predecessor. The results show that our proposed method outperforms, in terms of the $F_{1}$ score of the resulting Artificial Neural Network, not only BRKGA but also other commonly used methods in most of the test cases.
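To make the idea concrete, the following is a minimal sketch of one BRKGA-style generation with an extra exploitation step applied to the elite individuals, in the spirit of the approach described in the abstract. All parameter names (POP_SIZE, ELITE_FRAC, MUTANT_FRAC, RHO_E, STEP) and the toy fitness function are illustrative assumptions, not the authors' actual implementation or hyperparameter decoding.

```python
# Hedged sketch: one BRKGA-like generation plus a local exploitation pass.
# Parameter names and the toy fitness are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

POP_SIZE, N_KEYS = 50, 10        # population size and chromosome length (random keys in [0, 1])
ELITE_FRAC, MUTANT_FRAC = 0.2, 0.1
RHO_E = 0.7                      # bias: probability of inheriting the elite parent's key
STEP = 0.05                      # radius of the local exploitation perturbation


def fitness(chromosome):
    """Placeholder objective: in the paper's setting this would decode the random
    keys into hyperparameters, train an ANN, and return its validation F1 score."""
    return -np.sum((chromosome - 0.5) ** 2)  # toy surrogate to keep the sketch runnable


def evolve(population):
    scores = np.array([fitness(c) for c in population])
    order = np.argsort(scores)[::-1]                     # best individuals first
    n_elite = int(ELITE_FRAC * POP_SIZE)
    n_mutant = int(MUTANT_FRAC * POP_SIZE)

    elite = population[order[:n_elite]]
    non_elite = population[order[n_elite:]]

    # Biased crossover: each key is copied from the elite parent with probability RHO_E.
    children = []
    for _ in range(POP_SIZE - n_elite - n_mutant):
        e = elite[rng.integers(n_elite)]
        ne = non_elite[rng.integers(len(non_elite))]
        mask = rng.random(N_KEYS) < RHO_E
        children.append(np.where(mask, e, ne))

    mutants = rng.random((n_mutant, N_KEYS))             # fresh random individuals

    # Exploitation component: probe the close neighborhood of each elite individual
    # and keep the perturbed copy only if it scores better.
    explored = []
    for e in elite:
        cand = np.clip(e + rng.uniform(-STEP, STEP, N_KEYS), 0.0, 1.0)
        explored.append(cand if fitness(cand) > fitness(e) else e)

    return np.vstack([explored, children, mutants])


pop = rng.random((POP_SIZE, N_KEYS))
for _ in range(20):
    pop = evolve(pop)
```

In plain BRKGA the elite set is copied unchanged into the next generation; the added loop over `elite` is the exploitation step the paper argues for, searching the immediate neighborhood of the best random-key vectors instead of leaving it unexplored.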
Keywords
optimization, hyperparameter optimization, genetic algorithms, evolutionary algorithms, metaheuristic