Optformer: Beyond Transformer for Black-box Optimization

ICLR 2023

Abstract
We design a novel Transformer for continuous, unconstrained black-box optimization, called Optformer. Inspired by the similarity between the Vision Transformer and evolutionary algorithms (EAs), we modify the Transformer's multi-head self-attention layer, feed-forward network, and residual connection to implement the functions of the crossover, mutation, and selection operators. Moreover, we devise an iterated mode that generates and retains promising solutions, as EAs do. Optformer thus establishes a mapping from a random population to an optimal population. Compared to baselines such as EAs, Bayesian optimization, and a learning-to-optimize method, Optformer achieves the best performance on six black-box functions and one real-world application. We also find that even an untrained Optformer achieves good performance.
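The abstract maps Transformer components onto EA operators: self-attention plays the role of crossover (mixing individuals across the population), the feed-forward network plays mutation (perturbing each individual), and the residual connection plays selection (deciding what survives into the next population). The sketch below is a minimal illustration of that correspondence in NumPy, not the paper's actual architecture; the function names, the greedy residual-style selection rule, and the mutation step size are all assumptions made for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def optformer_step(pop, fitness, rng, step=0.1):
    """One illustrative iteration (hypothetical, not the paper's layer):
    attention ~ crossover, feed-forward ~ mutation, residual ~ selection."""
    # "Crossover": self-attention over the population axis mixes individuals.
    scores = softmax(pop @ pop.T / np.sqrt(pop.shape[1]))
    crossed = scores @ pop
    # "Mutation": a small feed-forward-style perturbation of each individual.
    mutated = crossed + step * rng.standard_normal(pop.shape)
    # "Selection": residual-style greedy keep of the fitter individual
    # (minimization), so the population never gets worse.
    keep = (fitness(mutated) < fitness(pop))[:, None]
    return np.where(keep, mutated, pop)

# Usage: minimize the sphere function with a population of 16 individuals.
rng = np.random.default_rng(0)
sphere = lambda x: (x ** 2).sum(axis=1)
pop = rng.standard_normal((16, 4))
best0 = sphere(pop).min()
for _ in range(200):
    pop = optformer_step(pop, sphere, rng)
best = sphere(pop).min()
```

Because selection here is a greedy elementwise keep, the best fitness in the population is monotonically non-increasing, which mirrors the "generate and survive" loop the abstract describes.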
Keywords
Transformer, Black-box optimization