Learning to branch with Tree-aware Branching Transformers

Knowledge-Based Systems (2022)

Abstract
Machine learning techniques have attracted increasing attention for learning Branch-and-Bound (B&B) variable selection policies, but most existing methods do not extend to heterogeneous problems. Although parameterizing search trees has recently emerged as a promising alternative for heterogeneous scenarios, it remains challenging to maintain good performance when generalizing to instances harder to solve than those seen during training. To fill this gap, we propose a tree-aware transformer-based branching framework for efficient and effective branching. Specifically, branching is performed by a transformer, in which the mutual interactions between candidate variables are evaluated by the self-attention mechanism. We then encode the empirical data in the search tree, i.e., the branching history, with a novel binary tree representation. In this way, features extracted from the parameterized B&B search trees can be fully utilized, yielding stronger branching policies. The proposed models are evaluated on multiple benchmark instances and achieve a significant performance boost, in terms of smaller B&B search trees and lower primal–dual integrals and gaps on harder problems within a given time limit. Ablation studies further validate the effectiveness of our method.
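The core idea of scoring B&B candidate variables with self-attention can be illustrated with a minimal sketch. This is not the paper's model: the feature layout, dimensions, and random weight matrices below are hypothetical stand-ins for learned parameters, intended only to show how pairwise interactions between candidates feed into per-variable branching scores.

```python
import numpy as np

def attention_branch_scores(cand_feats, d_k=16, seed=0):
    """Score candidate variables with one self-attention head.

    cand_feats: (n_cands, d) array, one feature row per candidate
    variable (hypothetical features, e.g. pseudocosts, fractionality).
    Weights are random placeholders for learned parameters.
    """
    rng = np.random.default_rng(seed)
    n, d = cand_feats.shape
    Wq = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Wk = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Wv = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Q, K, V = cand_feats @ Wq, cand_feats @ Wk, cand_feats @ Wv
    attn = Q @ K.T / np.sqrt(d_k)                 # pairwise interactions
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over candidates
    ctx = attn @ V                                # context-aware embeddings
    w_out = rng.standard_normal(d_k)
    return ctx @ w_out                            # one score per candidate

# Branch on the highest-scoring candidate variable.
feats = np.random.default_rng(1).standard_normal((5, 8))
scores = attention_branch_scores(feats)
best = int(np.argmax(scores))
```

Because attention mixes information across all candidates before scoring, each variable's score depends on the whole candidate set, which is the property the abstract attributes to the self-attention mechanism.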
Keywords
Branch and Bound, Machine learning, Branching strategies, Mixed Integer Linear Programming