Convergence Rates for Mixtures of Experts
Apr 1, 2023
TrungTin Nguyen
Postdoctoral Research Fellow
A central theme of my research is data science at the intersection of statistical learning, machine learning and optimization.
Publications
A General Theory for Softmax Gating Multinomial Logistic Mixture of Experts
The mixture-of-experts (MoE) model incorporates the power of multiple submodels via gating functions to achieve greater performance in …
Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts
Originally introduced as a neural network for ensemble learning, mixture of experts (MoE) has recently become a fundamental building …
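Both papers study convergence rates for parameter estimation in gated mixtures of experts. As a quick illustration of the model class involved, below is a minimal sketch of a softmax-gated MoE mean function with linear experts; the function names, parameter shapes, and random inputs are assumptions made for illustration only and do not come from either paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_mean(x, gate_W, gate_b, expert_W, expert_b):
    """Mean prediction of a softmax-gated MoE with K linear experts.

    x:        (n, d) inputs
    gate_W:   (d, K) gating weights,  gate_b:   (K,) gating biases
    expert_W: (d, K) expert weights,  expert_b: (K,) expert biases
    """
    # Gating probabilities pi_k(x): softmax of affine gating scores.
    pi = softmax(x @ gate_W + gate_b)   # (n, K)
    # Each expert k produces a linear mean mu_k(x) = x^T w_k + b_k.
    mu = x @ expert_W + expert_b        # (n, K)
    # MoE mean: sum_k pi_k(x) * mu_k(x).
    return (pi * mu).sum(axis=-1)       # (n,)

# Example usage with random parameters (illustrative values only).
rng = np.random.default_rng(0)
n, d, K = 5, 3, 4
x = rng.normal(size=(n, d))
y_hat = moe_mean(
    x,
    gate_W=rng.normal(size=(d, K)), gate_b=rng.normal(size=K),
    expert_W=rng.normal(size=(d, K)), expert_b=rng.normal(size=K),
)
```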