A General Framework for Constrained Bayesian Optimization using Information-based Search

Journal of Machine Learning Research, Volume 17, 2015.

We have introduced the notions of competitive decoupling, where two or more tasks compete for the same resource, and non-competitive decoupling, where the tasks require different resources and can therefore be evaluated in parallel.


We present an information-theoretic framework for solving global black-box optimization problems that also have black-box constraints. Of particular interest to us is to efficiently solve problems with decoupled constraints, in which subsets of the objective and constraint functions may be evaluated independently. For example, when the …



  • Many real-world optimization problems involve finding a global minimizer of a black-box objective function subject to a set of black-box constraints all being simultaneously satisfied.
  • Because they are based on EI, computing their acquisition function requires the current best feasible solution or incumbent: a location in the search space with low expected objective value and high probability of satisfying the constraints.
  • The resulting technique is called Predictive Entropy Search with Constraints (PESC) and its acquisition function approximates the expected information gain with regard to the solution of Eq (1), which we call x⋆.
  • Gelbart et al (2014) consider extending the EIC method from Section 2.1 to the decoupled setting, where the different functions can be independently evaluated at different input locations.
  • Gelbart et al (2014) circumvent the problem mentioned above by treating decoupling as a special case and using a two-stage acquisition function: first, the evaluation location x is chosen with EIC; then, given x, the task is chosen according to the expected reduction in the entropy of the global feasible minimizer x⋆, with the entropy computation approximated via the Monte Carlo sampling approach of Villemonteix et al (2009).
  • Our acquisition function approximates the expected gain of information about the solution to the constrained optimization problem, which is denoted by x⋆.
  • Eq (8) is used by PESC to efficiently solve constrained Bayesian optimization problems with decoupled function evaluations.
  • We show how PESC can be used to obtain the task-specific acquisition functions required by the general algorithm from Section 3.3.
  • We follow Snoek et al (2012) and average the PESC acquisition function with respect to the generated hyper-parameter samples.
  • We follow the method proposed by Hernandez-Lobato et al (2014) to average the acquisition function of Predictive Entropy Search in the unconstrained case.
  • An example in the unconstrained setting is Entropy Search (Hennig and Schuler, 2012), which requires re-computing an approximation to the acquisition function for each hyper-parameter sample Θj.
  • The complexity of PESC is O(MKN³), where M is the number of posterior samples of the global constrained minimizer x⋆, K is the number of constraints, and N is the number of collected data points.
  • PESC approximates the expected reduction in the posterior entropy of x⋆ with the acquisition function given by Eq (12).
  • As described in the last paragraph of Section 4.2, in the execution of EP we separate the computations that depend on D and the x_j, which are very expensive, from those that depend on the location x at which the PESC acquisition function will be evaluated.
  • An acquisition function that satisfies this requirement is said to be separable.
  • The results of our experiments show that PESC achieves state-of-the-art results in this scenario.
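For a separable acquisition function such as PESC's, the decoupled setting described in the bullets above reduces to picking the (task, location) pair with the highest score. A minimal sketch, assuming hypothetical `acq` and `cost` callables; normalizing by evaluation cost is one common way to trade off tasks of unequal expense, not necessarily the exact rule used in the paper:

```python
def select_next(tasks, candidates, acq, cost):
    """Pick the (task, x) pair maximizing the task-specific
    acquisition value per unit of evaluation cost.

    acq(task, x): task-specific expected information gain about x*
    cost(task):   expected cost of evaluating that task
    """
    return max(((t, x) for t in tasks for x in candidates),
               key=lambda tx: acq(tx[0], tx[1]) / cost(tx[0]))


# Toy usage with illustrative scores: the cheap objective wins here.
tasks = ["objective", "constraint"]
candidates = [0.0, 0.5, 1.0]
acq = lambda t, x: x if t == "objective" else 2 * x
cost = lambda t: 1.0 if t == "objective" else 4.0
best = select_next(tasks, candidates, acq, cost)
```

With these toy numbers, the objective task at x = 1.0 scores 1.0 per unit cost, beating the constraint task's best cost-normalized score of 0.5.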
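Averaging over GP hyper-parameter samples, as in the bullets above, amounts to taking the mean of the acquisition function across posterior samples Θj. A minimal sketch; the `alpha` callable and the sample list are illustrative placeholders, not the paper's implementation:

```python
import numpy as np

def averaged_acquisition(alpha, x, hyper_samples):
    """Average a per-sample acquisition alpha(x, theta) over posterior
    samples of the GP hyper-parameters, as in Snoek et al (2012)."""
    return float(np.mean([alpha(x, theta) for theta in hyper_samples]))
```

This marginalization makes the search robust to hyper-parameter uncertainty; for PESC it is cheap because, unlike Entropy Search, the approximation need not be fully recomputed for each sample Θj.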