High Dimensional Bayesian Optimisation via Low Dimensional Embedded Subspaces
Speaker: Tran The Hung

Time: 9h30, Thursday, February 27, 2020.
Location: Room 612, Building A6, Institute of Mathematics, 18 Hoang Quoc Viet.

Abstract: Scaling Bayesian optimisation (BO) to high-dimensional search spaces is an active and open research problem, particularly when no assumptions are made on the function structure. We propose a novel approach in which the acquisition function only requires maximisation on a discrete set of low-dimensional subspaces embedded in the original high-dimensional search space. We show that despite this convenience, our algorithm remains convergent. In particular, the cumulative regret of our algorithm grows only sub-linearly with the number of iterations. More importantly, as evident from our regret bounds, our algorithm provides a way to trade off the convergence rate against the number of subspaces used in the optimisation. Finally, when the number of subspaces is "sufficiently large", our algorithm's cumulative regret is at most $\mathcal{O}^{*}(\sqrt{T\gamma_T})$ as opposed to $\mathcal{O}^{*}(\sqrt{DT\gamma_T})$ for the GP-UCB of Srinivas et al. (2012), reducing a crucial factor of $\sqrt{D}$, where $D$ is the dimensionality of the input space. We perform extensive empirical experiments showing that the sample efficiency of our method is better than that of existing methods on many optimisation problems involving dimensions up to 5000.
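To illustrate the general idea described in the abstract, below is a minimal Python sketch of BO where a GP-UCB-style acquisition function is maximised only over a discrete set of candidates drawn from K random low-dimensional embedded subspaces. The Gaussian embedding matrices, the UCB rule and its exploration weight, the candidate sampling, the toy objective f, and all constants are illustrative assumptions, not the paper's exact algorithm or guarantees.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
D, d, K = 100, 2, 8          # ambient dimension, subspace dimension, number of subspaces
T, n_cand = 30, 256          # BO iterations, discrete candidates per subspace

def f(x):                    # toy objective: only 2 of the D dimensions matter
    return -np.sum((x[:2] - 0.5) ** 2)

# K random linear embeddings A_k: R^d -> R^D (assumption: Gaussian entries)
A = [rng.normal(size=(D, d)) for _ in range(K)]

def lift(Ak, z):             # map a low-dimensional point into [0,1]^D by clipping
    return np.clip(Ak @ z, 0.0, 1.0)

X, y = [], []
x0 = rng.uniform(size=D)     # one initial random observation
X.append(x0)
y.append(f(x0))

for t in range(T):
    # Fit a GP surrogate on all observations in the full D-dimensional space
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(np.array(X), np.array(y))
    beta = 2.0 * np.log((t + 1) ** 2)   # UCB exploration weight (assumed schedule)
    best_val, best_x = -np.inf, None
    for Ak in A:                        # maximise UCB on each embedded subspace
        Z = rng.uniform(-1.0, 1.0, size=(n_cand, d))
        cand = np.array([lift(Ak, z) for z in Z])
        mu, sd = gp.predict(cand, return_std=True)
        ucb = mu + np.sqrt(beta) * sd
        i = int(np.argmax(ucb))
        if ucb[i] > best_val:
            best_val, best_x = ucb[i], cand[i]
    X.append(best_x)                    # evaluate the winning candidate
    y.append(f(best_x))

print("best value found:", max(y))

Each iteration thus only searches K discrete d-dimensional candidate sets rather than the full D-dimensional space, which is what makes the acquisition step cheap when D is large.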

Paper link: https://arxiv.org/abs/1911.11950
