Bayesian Nonparametrics in Mixture of Experts Models

A wide class of problems can be formulated as inverse problems, where the goal is to find parameter values that best explain observed measurements. Typical constraints in practice are that the relationships between parameters and observations are highly non-linear, with high-dimensional observations and multi-dimensional correlated parameters. To handle these constraints, we consider probabilistic mixtures of locally linear models, which can be seen as particular instances of mixtures of experts (MoE). We have shown in previous studies that such models provide good approximations, provided the number of experts is large enough. The contribution of this work is a general scheme for designing a tractable Bayesian nonparametric (BNP) MoE model, which avoids any commitment to an arbitrary number of experts. A tractable estimation algorithm is designed using a variational approximation, and theoretical properties are derived on the predictive distribution and on the number of components. Illustrations on simulated and real data show good results, in terms of both selection and computing time, compared with more traditional model selection procedures.
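
To make the modelling idea concrete, the sketch below is a minimal, hypothetical illustration rather than the model proposed in the paper: a truncated Dirichlet-process Gaussian mixture, fit by variational inference with scikit-learn's BayesianGaussianMixture, is placed on joint (x, y) samples. Each Gaussian component then induces a locally affine expert through the conditional mean E[y | x, k], and the BNP prior lets the variational algorithm prune superfluous components rather than committing to a fixed number of experts. The toy data, the truncation level of 20, the 1e-2 weight threshold, and the predict helper are all illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Toy inverse problem: a piecewise-linear parameter-to-observation map with noise.
x = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.where(x < 0, -2.0 * x + 1.0, 0.5 * x - 1.0)
y += 0.1 * rng.standard_normal(y.shape)

# Truncated Dirichlet-process GMM on the joint (x, y) space, fit variationally.
# n_components is only the truncation level, not a committed number of experts.
dpgmm = BayesianGaussianMixture(
    n_components=20,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(np.hstack([x, y]))

# Effective number of experts: components that keep non-negligible weight.
active = np.flatnonzero(dpgmm.weights_ > 1e-2)
print("active experts:", active.size)

def predict(x_new, d_x=1):
    """Predictive mean: gate-weighted mixture of per-component affine experts."""
    x_new = np.atleast_2d(x_new)
    num, den = 0.0, 0.0
    for k in active:
        mu, cov, w = dpgmm.means_[k], dpgmm.covariances_[k], dpgmm.weights_[k]
        mu_x, mu_y = mu[:d_x], mu[d_x:]
        cov_xx, cov_yx = cov[:d_x, :d_x], cov[d_x:, :d_x]
        cov_xx_inv = np.linalg.inv(cov_xx)
        diff = x_new - mu_x
        # Gate: Gaussian density of x under component k (shared constants cancel).
        quad = np.einsum("ni,ij,nj->n", diff, cov_xx_inv, diff)
        gate = w * np.exp(-0.5 * quad) / np.sqrt(np.linalg.det(cov_xx))
        # Expert: affine conditional mean E[y | x, k] = mu_y + cov_yx cov_xx^{-1} (x - mu_x).
        mean_y = mu_y + diff @ cov_xx_inv @ cov_yx.T
        num = num + gate[:, None] * mean_y
        den = den + gate[:, None]
    return num / den

print(predict(np.array([[-1.0], [2.0]])))
```

The conditional construction mirrors how mixtures of locally linear models derive affine experts from a joint Gaussian model; the paper's actual model, priors, and variational scheme may differ from this stand-in.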