Mixture of expert posterior surrogates for approximate Bayesian computation

Abstract

A key ingredient in approximate Bayesian computation (ABC) procedures is the choice of a discrepancy that describes how different the simulated and observed data are, often based on a set of summary statistics when the data cannot be compared directly. Unless discrepancies and summaries are available from expert or prior knowledge, which seldom occurs, they have to be chosen, and this choice can affect the quality of approximations. The choice between discrepancies is an active research topic, which has mainly considered data discrepancies requiring samples of observations or distances between summary statistics. In this work, we introduce a preliminary learning step in which surrogate posteriors are built using a specific instance of a Mixture of Experts model. These surrogate posteriors are then used in place of summary statistics and compared using metrics between distributions in place of data discrepancies. The resulting ABC quasi-posterior distribution is shown to converge to the true one, under standard conditions. Experiments show that our approach is particularly useful when the posterior is multimodal.
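The idea of swapping data discrepancies for distances between surrogate posteriors can be illustrated with a minimal rejection-ABC sketch. This is only a toy sketch, not the paper's method: the learned Mixture of Experts surrogate is replaced by a simple Gaussian fit, the metric between distributions is the closed-form 2-Wasserstein distance between univariate Gaussians, and the model, prior, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Toy generative model: Gaussian data with unknown mean theta.
    return rng.normal(theta, 1.0, size=n)

def surrogate(y):
    # Stand-in for a learned surrogate posterior: a Gaussian summarized
    # by (mean, std). The paper instead fits a Mixture of Experts model
    # in a preliminary learning step.
    return y.mean(), y.std()

def w2_gauss(m1, s1, m2, s2):
    # 2-Wasserstein distance between two 1-D Gaussians has the closed
    # form sqrt((m1 - m2)^2 + (s1 - s2)^2).
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

def abc_rejection(y_obs, n_draws=5000, quantile=0.01):
    # Rejection ABC where the discrepancy is a metric between surrogate
    # distributions rather than a distance between summary statistics.
    m_obs, s_obs = surrogate(y_obs)
    thetas = rng.uniform(-5.0, 5.0, size=n_draws)   # prior draws
    dists = np.array(
        [w2_gauss(*surrogate(simulate(t)), m_obs, s_obs) for t in thetas]
    )
    eps = np.quantile(dists, quantile)              # adaptive tolerance
    return thetas[dists <= eps]                     # accepted draws

y_obs = rng.normal(1.5, 1.0, size=50)  # observed data, true mean 1.5
post = abc_rejection(y_obs)
# Accepted draws approximate the posterior and concentrate near 1.5.
```

In this sketch the acceptance threshold is set adaptively as a small quantile of the simulated discrepancies, a common practical choice in rejection ABC; in a multimodal setting the single-Gaussian surrogate used here would be replaced by the mixture-based surrogate of the paper.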

Publication
53èmes Journées de Statistique de la Société Française de Statistique (SFdS)
TrungTin Nguyen
Postdoctoral Research Fellow

A central theme of my research is data science at the intersection of statistical learning, machine learning and optimization.
