Bayesian nonparametric mixture of experts for high-dimensional inverse problems

Abstract

A large class of problems can be formulated as inverse problems, where the goal is to find parameter values that best explain some observed measurements. Typical constraints in practice are that the relationships between parameters and observations are highly nonlinear, with high-dimensional observations and multi-dimensional, correlated parameters. To deal with these constraints, we consider probabilistic mixtures of locally linear models using inverse regression strategies, namely the Gaussian locally linear mapping (GLLiM) models. These can be seen as special instances of mixture of experts (MoE) models. The popularity of MoE models is largely due to their universal approximation properties, provided that the number of mixture components is large enough. In this paper, we propose a general scheme to design a tractable Bayesian nonparametric GLLiM (BNP-GLLiM) model that avoids any commitment to an arbitrary number of components. A tractable estimation procedure is designed using a variational Bayesian expectation-maximization algorithm. In particular, we establish posterior consistency for the number of components in BNP-GLLiM after post-processing with the merge-truncate-merge algorithm. Illustrations on simulated data show good results in terms of recovering the true number of clusters and the mean regression function.
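To make the inverse regression strategy concrete, the sketch below illustrates the GLLiM inversion idea under the standard GLLiM parameterization: each component k has a Gaussian prior on the low-dimensional parameter, X | Z=k ~ N(c_k, Γ_k), and a locally linear forward map, Y | X=x, Z=k ~ N(A_k x + b_k, Σ_k), so that the inverse conditional p(x | y) is again a Gaussian mixture available in closed form. This is a minimal sketch, not the paper's code; the function name `gllim_inverse_mean` and all parameter values are hypothetical, and the (fixed, finite) number of components here stands in for what the BNP-GLLiM model infers.

```python
# Minimal sketch of the GLLiM inversion trick (hypothetical parameters, not the paper's code).
import numpy as np
from scipy.stats import multivariate_normal

def gllim_inverse_mean(y, pi, c, Gamma, A, b, Sigma):
    """Posterior mean E[X | Y = y] under a K-component GLLiM model.

    Forward model per component k:
        X | Z=k ~ N(c[k], Gamma[k]),  Y | X=x, Z=k ~ N(A[k] @ x + b[k], Sigma[k]).
    """
    K = len(pi)
    log_w = np.empty(K)
    means = []
    for k in range(K):
        # Marginal of Y under component k: N(A_k c_k + b_k, Sigma_k + A_k Gamma_k A_k^T)
        m_y = A[k] @ c[k] + b[k]
        S_y = Sigma[k] + A[k] @ Gamma[k] @ A[k].T
        log_w[k] = np.log(pi[k]) + multivariate_normal.logpdf(y, m_y, S_y)
        # Inverse (posterior) parameters of X | Y = y, Z = k
        Gamma_inv = np.linalg.inv(Gamma[k])
        Sigma_inv = np.linalg.inv(Sigma[k])
        S_star = np.linalg.inv(Gamma_inv + A[k].T @ Sigma_inv @ A[k])
        A_star = S_star @ A[k].T @ Sigma_inv
        b_star = S_star @ (Gamma_inv @ c[k] - A[k].T @ Sigma_inv @ b[k])
        means.append(A_star @ y + b_star)
    # Responsibilities w_k(y), computed stably in log space
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return sum(wk * mk for wk, mk in zip(w, means))

# Toy usage: 1-D parameter x, 2-D observation y, K = 2 components.
rng = np.random.default_rng(0)
pi = np.array([0.5, 0.5])
c = [np.array([-1.0]), np.array([1.0])]
Gamma = [np.eye(1), np.eye(1)]
A = [rng.normal(size=(2, 1)), rng.normal(size=(2, 1))]
b = [np.zeros(2), np.ones(2)]
Sigma = [0.1 * np.eye(2), 0.1 * np.eye(2)]
print(gllim_inverse_mean(np.array([0.5, 0.5]), pi, c, Gamma, A, b, Sigma))
```

The key design point, as in the abstract, is that learning is done in the low-to-high (forward) direction, where the linear maps are cheap to estimate, while the high-to-low (inverse) posterior needed for the inverse problem is recovered analytically.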

Publication
hal-04015203
TrungTin Nguyen
Postdoctoral Research Fellow

A central theme of my research is data science at the intersection of statistical learning, machine learning and optimization.
