Bayesian optimization with informative parametric models via sequential Monte Carlo

Rafael Oliveira, Richard Scalzo, Robert Kohn, Sally Cripps, Kyle Hardman, John Close, Nasrin Taghavi, Charles Lemckert

Research output: Contribution to journal › Article › peer-review


Abstract

Bayesian optimization (BO) has been a successful approach to optimizing expensive functions when prior knowledge about them can be specified by means of a probabilistic model. Owing to their expressiveness and tractable closed-form predictive distributions, Gaussian process (GP) surrogate models have been the default choice when deriving BO frameworks. However, as nonparametric models, GPs offer little interpretability and informative power when applied to model complex physical phenomena in scientific applications. In addition, the Gaussian assumption limits the applicability of GPs to problems where the variables of interest may deviate strongly from Gaussianity. In this article, we investigate an alternative modeling framework for BO that uses sequential Monte Carlo (SMC) to perform Bayesian inference with parametric models. We propose a BO algorithm that takes advantage of SMC's flexible posterior representations, and we provide methods to compensate for bias in the approximations and to reduce particle degeneracy. Experimental results on simulated engineering applications in water-leak detection and contaminant source localization show performance improvements over GP-based BO approaches.
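To give a concrete flavor of the idea described above, the sketch below shows a minimal BO loop in which the surrogate is a simple parametric model whose posterior is represented by weighted particles updated in an SMC-style fashion, with expected improvement averaged over the particles as the acquisition function. This is not the authors' implementation: the quadratic model form, flat prior, noise level, resample-move scheme, and grid search over the acquisition are all illustrative assumptions.

```python
# Minimal sketch: BO with a parametric surrogate whose posterior is a set of
# weighted particles (SMC-style incremental reweighting + resample-move).
# All model choices here are assumptions for illustration only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def model(theta, x):
    """Assumed parametric surrogate: y = a*(x - b)**2 + c."""
    a, b, c = theta
    return a * (x - b) ** 2 + c

def log_likelihood(theta, X, y, noise_std=0.1):
    resid = y - model(theta, X)
    return -0.5 * np.sum((resid / noise_std) ** 2)

def reweight(particles, weights, X_new, y_new):
    """Incremental SMC reweighting with the newly observed data only."""
    log_w = np.log(weights) + np.array(
        [log_likelihood(t, X_new, y_new) for t in particles])
    log_w -= log_w.max()
    w = np.exp(log_w)
    return w / w.sum()

def resample_move(particles, weights, X, y, step=0.05):
    """Systematic resampling plus a random-walk Metropolis move targeting the
    full-data posterior (flat prior assumed) to combat particle degeneracy."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx]
    for i in range(n):
        prop = particles[i] + step * rng.standard_normal(3)
        log_a = log_likelihood(prop, X, y) - log_likelihood(particles[i], X, y)
        if np.log(rng.uniform()) < log_a:
            particles[i] = prop
    return particles, np.full(n, 1.0 / n)

def expected_improvement(x, particles, weights, best, noise_std=0.1):
    """EI for minimization, marginalized over the particle posterior."""
    mu = np.array([model(t, x) for t in particles])
    z = (best - mu) / noise_std
    ei = (best - mu) * norm.cdf(z) + noise_std * norm.pdf(z)
    return np.sum(weights * ei)

def objective(x):
    """Hypothetical expensive objective to minimize."""
    return (x - 0.3) ** 2 + 0.05 * rng.standard_normal()

# Particles drawn from a broad prior; two random initial evaluations.
n_particles = 200
particles = rng.normal(0.0, 1.0, size=(n_particles, 3))
weights = np.full(n_particles, 1.0 / n_particles)
X = rng.uniform(-1, 1, size=2)
y = np.array([objective(x) for x in X])
weights = reweight(particles, weights, X, y)

grid = np.linspace(-1, 1, 201)
for _ in range(10):
    # Resample and move when the effective sample size drops too low.
    if 1.0 / np.sum(weights ** 2) < n_particles / 2:
        particles, weights = resample_move(particles, weights, X, y)
    best = y.min()
    acq = np.array([expected_improvement(x, particles, weights, best) for x in grid])
    x_next = grid[np.argmax(acq)]
    y_next = objective(x_next)
    X = np.append(X, x_next)
    y = np.append(y, y_next)
    # Update the particle posterior with the new observation only.
    weights = reweight(particles, weights, np.array([x_next]), np.array([y_next]))

print("Best observed value:", y.min(), "at x =", X[np.argmin(y)])
```

The resample-move step is one common way to mitigate particle degeneracy; the paper's specific bias-compensation and degeneracy-reduction methods are not reproduced here.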

Original language: English
Article number: e5
Pages (from-to): 1-19
Number of pages: 19
Journal: Data-Centric Engineering
Volume: 3
Issue number: 1
DOIs
Publication status: Published - 8 Mar 2022

