Bayesian Optimization Sequential Surrogate (BOSS) Algorithm: Fast Bayesian Inference for a Broad Class of Bayesian Hierarchical Models
Approximate Bayesian inference based on the Laplace approximation and quadrature methods has become increasingly popular for its efficiency in fitting latent Gaussian and extended latent Gaussian models (LGMs/ELGMs), which encompass many popular hierarchical models. However, many useful models belong to the LGM/ELGM frameworks only when some parameters are fixed; such models are termed conditional LGMs/ELGMs (Gomez-Rubio and Rue, 2018). Existing methods for fitting conditional LGMs/ELGMs rely on grid search or Markov chain Monte Carlo (MCMC) to explore the unnormalized posterior density. These procedures become computationally prohibitive beyond simple scenarios, as each evaluation of the density requires fitting a separate LGM/ELGM. In this work, we introduce the Bayesian optimization sequential surrogate (BOSS) algorithm to reduce the computational cost of fitting conditional LGMs/ELGMs. With orders of magnitude fewer evaluations than grid or MCMC methods, Bayesian optimization generates sequential design points that capture the majority of the posterior mass of the conditioning parameter, which in turn yields an accurate surrogate posterior distribution that can be normalized at negligible computational cost. We illustrate the efficiency, accuracy, and practical utility of the proposed method through extensive simulation studies and real-world applications.
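The abstract's core idea can be sketched in a toy one-dimensional setting. The code below is a hypothetical illustration, not the authors' implementation: Bayesian optimization (here with a Gaussian-process surrogate and an upper-confidence-bound acquisition, which may differ from the paper's choices) selects design points of an expensive unnormalized log-posterior of the conditioning parameter; the surrogate is then exponentiated and normalized by quadrature. The inner LGM fit, which in practice dominates the cost of each evaluation, is stood in for by a cheap closed-form Gaussian log-density.

```python
import numpy as np

def log_post_unnorm(theta):
    # Stand-in for the unnormalized log-posterior of the conditioning
    # parameter: in a real conditional LGM/ELGM, each call would require
    # fitting a separate LGM/ELGM via Laplace approximation/quadrature.
    return -0.5 * (theta - 1.3) ** 2 / 0.4

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel with lengthscale ls.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_predict(X, y, Xs, jitter=1e-6):
    # Standard zero-mean GP regression: posterior mean and variance at Xs.
    K = rbf(X, X) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(Xs, X)
    mu = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v ** 2, axis=0)  # rbf(x, x) = 1 on the diagonal
    return mu, np.maximum(var, 0.0)

grid = np.linspace(-3.0, 5.0, 401)            # quadrature grid
dx = grid[1] - grid[0]
X = np.array([-3.0, -1.0, 1.0, 3.0, 5.0])     # small initial design
y = log_post_unnorm(X)

for _ in range(15):
    # Center observations so the GP prior mean sits at the data level.
    mu, var = gp_predict(X, y - y.mean(), grid)
    mu += y.mean()
    ucb = mu + 2.0 * np.sqrt(var)             # acquisition: explore/exploit
    # Never re-select a grid point already (numerically) in the design.
    ucb[np.min(np.abs(grid[:, None] - X[None, :]), axis=1) < 1e-9] = -np.inf
    x_new = grid[np.argmax(ucb)]
    X = np.append(X, x_new)
    y = np.append(y, log_post_unnorm(x_new))

# Surrogate posterior: exponentiate the GP mean, normalize by quadrature.
mu, _ = gp_predict(X, y - y.mean(), grid)
mu += y.mean()
dens = np.exp(mu - mu.max())
dens /= dens.sum() * dx
post_mean = (grid * dens).sum() * dx
```

With only 20 evaluations of the (stand-in) expensive density, the design points concentrate where the posterior mass lies, and the normalized surrogate recovers the posterior mean and mode of the conditioning parameter; a grid search of comparable accuracy would need evaluations at every quadrature node.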
Language of oral presentation
English
Language of visual supports
English

Speaker

Dayi Li, University of Toronto