Program synthesis as latent continuous optimization: evolutionary search in neural embeddings

Genetic and Evolutionary Computation Conference (2020)

Abstract

In optimization and machine learning, the divide between discrete and continuous problems and methods is deep and persistent. We attempt to remove this distinction by training neural network autoencoders that embed discrete candidate solutions in continuous latent spaces. This allows us to take advantage of state-of-the-art continuous optimization methods for solving discrete optimization problems, and mitigates certain challenges in discrete optimization, such as the design of bias-free search operators. In the experimental part, we consider program synthesis as a special case of combinatorial optimization. We train an autoencoder network on a large sample of programs in a problem-agnostic, unsupervised manner, and then use it with an evolutionary continuous optimization algorithm (CMA-ES) to map points from the latent space to programs. We also propose a variant in which semantically similar programs are more likely to have similar embeddings. Assessment on a range of benchmarks in two domains indicates the viability of this approach and the usefulness of involving program semantics.
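The pipeline described above (embed programs with an autoencoder, then run CMA-ES in the latent space and decode candidate points back to programs) can be illustrated with a minimal sketch. The following Python code is not the paper's implementation: the tiny token DSL (`VOCAB`), the dense autoencoder, and the placeholder fitness function are illustrative assumptions, and a real objective would execute the decoded program on the benchmark's test cases. It only assumes PyTorch and the `cma` package.

```python
# Minimal sketch: evolutionary search (CMA-ES) in a neural program embedding.
# The DSL, the autoencoder architecture, and the fitness stub are placeholders.

import numpy as np
import torch
import torch.nn as nn
import cma  # pip install cma

VOCAB = ["x", "1", "+", "-", "*"]   # hypothetical token set of a tiny DSL
MAX_LEN, LATENT_DIM = 8, 16


class ProgramAutoencoder(nn.Module):
    """Embeds fixed-length token sequences in a continuous latent space."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(len(VOCAB), 8)
        self.encoder = nn.Linear(MAX_LEN * 8, LATENT_DIM)
        self.decoder = nn.Linear(LATENT_DIM, MAX_LEN * len(VOCAB))

    def encode(self, tokens):                       # tokens: (batch, MAX_LEN)
        x = self.embed(tokens).flatten(1)
        return self.encoder(x)

    def decode(self, z):                            # z: (batch, LATENT_DIM)
        logits = self.decoder(z).view(-1, MAX_LEN, len(VOCAB))
        return logits.argmax(dim=-1)                # most likely token per slot


# Assume the autoencoder has been trained, unsupervised, on a program corpus.
ae = ProgramAutoencoder()


def fitness(z):
    """Decode a latent point to a program and score it (placeholder score)."""
    with torch.no_grad():
        tokens = ae.decode(torch.tensor(z, dtype=torch.float32).unsqueeze(0))[0]
    program = [VOCAB[t] for t in tokens.tolist()]
    # A real objective would run `program` on test cases and return its error;
    # here we return a dummy value so the sketch runs end to end.
    return float(len(set(program)))                 # minimized by CMA-ES


# CMA-ES searches the continuous latent space; decoded programs are evaluated.
es = cma.CMAEvolutionStrategy(np.zeros(LATENT_DIM), 0.5, {"maxiter": 20})
es.optimize(fitness)
print("best latent point (first coords):", es.result.xbest[:4])
```

The semantic variant mentioned in the abstract would change only the autoencoder's training objective (encouraging semantically similar programs to receive nearby embeddings); the search loop above would stay the same.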
Keywords
Program Synthesis, Genetic Programming, Deep Learning, Embedding, Autoencoders