Convergence Rates for Mixtures of Experts
Apr 1, 2023
TrungTin Nguyen
Postdoctoral Research Fellow
A central theme of my research is data science at the intersection of statistical learning, machine learning and optimization.
Publications
A General Theory for Softmax Gating Multinomial Logistic Mixture of Experts
The mixture-of-experts (MoE) model combines the power of multiple submodels via gating functions to achieve greater performance in …
Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts
Originally introduced as a neural network for ensemble learning, mixture of experts (MoE) has recently become a fundamental building …
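As a rough illustration of the softmax-gating mechanism these abstracts refer to, the following is a minimal NumPy sketch of a mixture of experts: gate weights come from a softmax over affine scores of the input and are used to blend expert predictions. The function names (`moe_mean`) and the toy linear experts are hypothetical, not from the papers above.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_mean(x, W, c, experts):
    """Mixture-of-experts prediction: softmax gating over affine
    scores x @ W + c, blended with each expert's output."""
    gates = softmax(x @ W + c)                          # (n, K) mixing weights
    preds = np.stack([f(x) for f in experts], axis=-1)  # (n, K) expert outputs
    return (gates * preds).sum(axis=-1)                 # gated combination, (n,)

# Toy example with two hypothetical linear experts.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 2))
W = rng.normal(size=(2, 2))
c = np.zeros(2)
experts = [lambda x: x @ np.array([1.0, 0.0]),
           lambda x: x @ np.array([0.0, 1.0])]
y = moe_mean(x, W, c, experts)
```

The gate weights sum to one for every input, so each prediction is a convex combination of the expert outputs at that point.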