Searching Search Spaces: Meta-evolving a Geometric Encoding for Neural Networks
arXiv (2024)
Abstract
In evolutionary policy search, neural networks are usually represented using
a direct mapping: each gene encodes one network weight. Indirect encoding
methods, where each gene can encode for multiple weights, shorten the genome to
reduce the dimensions of the search space and better exploit permutations and
symmetries. The Geometric Encoding for Neural network Evolution (GENE)
introduced an indirect encoding where the weight of a connection is computed as
the (pseudo-)distance between the two linked neurons, so the genome size grows
linearly with the number of neurons instead of quadratically as in direct
encoding. However, GENE still relies on hand-crafted distance functions with no
prior optimization. Here we show that better-performing distance functions can
be found for GENE using Cartesian Genetic Programming (CGP) in a meta-evolution
approach, hence optimizing the encoding to create a search space that is easier
to exploit. We show that GENE with a learned function can outperform both
direct encoding and the hand-crafted distances, generalizing on unseen
problems, and we study how the encoding impacts neural network properties.
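The core idea of a GENE-style encoding can be illustrated with a short sketch. The following is a minimal illustration under assumed details (3-D neuron coordinates, plain Euclidean distance as the distance function); the paper itself uses hand-crafted or learned pseudo-distances, which, unlike L2, can produce negative and asymmetric weights. Note how the genome length scales with the number of neurons rather than the number of connections.

```python
# Minimal sketch of a GENE-style indirect encoding (illustrative, not the
# paper's exact formulation). Each neuron is encoded by a d-dimensional
# coordinate vector; a connection weight is the distance between the
# coordinates of the two linked neurons.
import numpy as np


def l2(a, b):
    """Euclidean distance. A stand-in for the (pseudo-)distance function;
    plain L2 yields only non-negative, symmetric weights, which is why
    GENE relies on pseudo-distances in practice."""
    return float(np.linalg.norm(a - b))


def gene_decode(coords_pre, coords_post, distance_fn):
    """Decode a genome of neuron coordinates into a weight matrix:
    W[i, j] = distance(pre_i, post_j)."""
    return np.array([[distance_fn(a, b) for b in coords_post]
                     for a in coords_pre])


rng = np.random.default_rng(0)
d = 3                      # coordinate dimension (genes per neuron)
n_in, n_out = 4, 2
# Genome: (n_in + n_out) * d genes, linear in neuron count, versus
# n_in * n_out weights for a direct encoding of the same layer.
genome = rng.normal(size=(n_in + n_out, d))
W = gene_decode(genome[:n_in], genome[n_in:], l2)
print(W.shape)             # one weight per input-output pair
```

In this sketch the meta-evolution step described in the abstract would replace `l2` with a program evolved by CGP, searching over distance functions rather than over weights directly.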