CobBO: Coordinate Backoff Bayesian Optimization with Two-Stage Kernels

Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Machine Learning (stat.ML); Optimization and Control (math.OC)
ACM classes: I.2.6; I.2.8; G.3
MSC classes: 62C12, 68T20, 68W27
DOI: 10.48550/arxiv.2101.05147 Publication Date: 2021-01-01
ABSTRACT
Bayesian optimization is a popular method for optimizing expensive black-box functions, yet it often struggles in high dimensions, where the computation can become prohibitively heavy. To alleviate this problem, we introduce Coordinate Backoff Bayesian Optimization (CobBO) with two-stage kernels. In each round, the first stage uses a simple, coarse kernel that sacrifices approximation accuracy for computational efficiency; it captures the global landscape by deliberately smoothing away local fluctuations. In the second stage of the same round, past observations in the full space are projected onto a selected coordinate subspace to form virtual points. These virtual points, together with the means and variances of their unknown function values as estimated by the coarse kernel of the first stage, are fitted with a more sophisticated kernel model. Within the selected low-dimensional subspace, the computational cost of Bayesian optimization becomes affordable. To further enhance performance, a sequence of consecutive observations is collected in the same subspace, which effectively refines the approximation of the function. This refinement lasts until a stopping rule determines when to back off from the current subspace and switch to another. The resulting decoupling significantly reduces the computational burden in high dimensions while fully leveraging the observations in the whole space, rather than relying only on observations within each coordinate subspace. Extensive evaluations show that CobBO finds solutions comparable to or better than other state-of-the-art methods for dimensions ranging from tens to hundreds, while reducing both trial complexity and computational cost.

Jian Tan and Niv Nayman contributed equally. An implementation of CobBO is available at: https://github.com/Alibaba-MIIL/CobBO
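The two-stage idea in the abstract can be sketched with a toy Gaussian-process surrogate. The RBF kernel, the specific length-scales, the random coordinate selection, and the quadratic toy objective below are all illustrative assumptions for exposition, not the paper's actual implementation (see the linked repository for that):

```python
import numpy as np

def rbf(a, b, ls):
    """Squared-exponential kernel with length-scale ls (unit prior variance)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xq, ls, noise=1e-6):
    """GP posterior mean and variance at query points Xq.

    `noise` may be a scalar or a per-point array, which lets stage 2 use the
    stage-1 predictive variances as heteroscedastic observation noise.
    """
    K = rbf(X, X, ls) + np.diag(np.broadcast_to(noise, len(X)))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xq, ls)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - (v ** 2).sum(0)          # prior variance is 1 for this kernel
    return mu, np.maximum(var, 1e-12)

rng = np.random.default_rng(0)
dim, n_obs, k = 20, 30, 3

f = lambda x: -np.sum((x - 0.5) ** 2, axis=-1)   # toy objective (assumption)
X = rng.random((n_obs, dim))                     # past observations, full space
y = f(X)
incumbent = X[np.argmax(y)]

# Select a k-dimensional coordinate subspace (here: uniformly at random).
coords = rng.choice(dim, size=k, replace=False)

# Project past observations into the subspace anchored at the incumbent:
# selected coordinates are kept, the rest are pinned to the incumbent's
# values, yielding "virtual points" whose function values are unknown.
V = np.tile(incumbent, (n_obs, 1))
V[:, coords] = X[:, coords]

# Stage 1: coarse, smooth surrogate over the full space (large length-scale)
# estimates the virtual points' means and variances.
mu_v, var_v = gp_posterior(X, y, V, ls=2.0)

# Stage 2: fit a finer kernel on the k-dimensional virtual points only,
# treating the stage-1 variances as observation noise; Bayesian optimization
# within the subspace then queries this cheap low-dimensional model.
Xq = rng.random((5, k))                          # candidate subspace points
mu_q, var_q = gp_posterior(V[:, coords], mu_v, Xq, ls=0.3, noise=var_v + 1e-6)
```

The point of the sketch is the cost structure: the expensive, accurate model is only ever fitted in k dimensions, while every full-space observation still contributes through its projected virtual point.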