Biography

Hello and welcome! My Vietnamese name is Nguyễn Trung Tín, so I use “TrungTin Nguyen” or “Trung Tin Nguyen” in my English publications. My first name is “Tín”, or “Tin” for short.

I am currently a Postdoctoral Fellow at the Inria centre at the University Grenoble Alpes in the Statify team, where I am very fortunate to be mentored by Florence Forbes, Hien Duy Nguyen, and Julyan Arbel.

I completed my Ph.D. degree in Statistics and Data Science at Normandie Univ in December 2021, where I was very fortunate to be advised by Faicel Chamroukhi. During my Ph.D. research, I was also very fortunate to collaborate with Geoff McLachlan, focusing on mixture models. I received a four-month Visiting PhD Fellowship at the Inria centre at the University Grenoble Alpes in the Statify team, within the LANDER project.

A central theme of my research is data science, at the interface of:

  • Statistical learning: Model selection (minimal penalties and slope heuristics, non-asymptotic oracle inequalities), simulation-based inference (approximate Bayesian computation, Bayesian synthetic likelihood, method of moments), Bayesian nonparametrics (Gibbs-type priors, Dirichlet process mixture), high-dimensional statistics (variable selection via Lasso and penalization, graphical models).
  • Machine learning: Supervised learning (deep hierarchical mixture of experts (DMoE), deep neural networks), unsupervised learning (clustering via mixture models, dimensionality reduction via principal component analysis, deep generative models via variational autoencoders, generative adversarial networks and normalizing flows), reinforcement learning (partially observable Markov decision process).
  • Optimization: Robust and effective optimization algorithms for DMoE (expectation–maximization (EM), variational Bayesian EM), difference of convex algorithm, optimal transport (Wasserstein distance).
  • Biostatistics: Statistical learning and machine learning for large biological data sets (genomics, transcriptomics, proteomics).

Interests

  • Data Science
  • Statistics
  • Artificial Intelligence

Education

  • Ph.D. in Statistics and Data Science, 2018-2021

    Université de Caen Normandie, France

  • M.S. in Applied Mathematics, 2017-2018

    Université d'Orléans, France

  • B.S. Honors Program in Mathematics and Computer Science, 2013-2017

University of Science, Vietnam National University Ho Chi Minh City, Vietnam

Publications

(2023). Bayesian nonparametric mixture of experts for high-dimensional inverse problems. hal-04015203.

(2023). Concentration results for approximate Bayesian computation without identifiability. hal-03987197.

(2022). Summary statistics and discrepancy measures for approximate Bayesian computation via surrogate posteriors. Statistics and Computing.

(2022). A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts. Electronic Journal of Statistics.

(2022). Mixture of expert posterior surrogates for approximate Bayesian computation. 53èmes Journées de Statistique de la Société Française de Statistique (SFdS).

(2022). Model selection by penalization in mixture of experts models with a non-asymptotic approach. 53èmes Journées de Statistique de la Société Française de Statistique (SFdS).

(2022). Approximation of probability density functions via location-scale finite mixtures in Lebesgue spaces. Communications in Statistics - Theory and Methods.

(2021). Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models. Journal of Statistical Distributions and Applications.

(2021). A non-asymptotic model selection in block-diagonal mixture of polynomial experts models. arXiv preprint arXiv:2104.08959.

(2020). An l1-oracle inequality for the Lasso in mixture-of-experts regression models. arXiv preprint arXiv:2009.10622.

(2020). Approximation by finite mixtures of continuous density functions that vanish at infinity. Cogent Mathematics & Statistics.

Recent & Upcoming Talks

  • Model selection by penalization in mixture of experts models with a non-asymptotic approach
  • A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
  • A non-asymptotic approach for model selection via penalization in mixture of experts models
  • A non-asymptotic model selection in mixture of experts models
  • Model Selection and Approximation in High-dimensional Mixtures of Experts Models: From Theory to Practice
  • Model Selection and Approximation in High-dimensional Mixtures of Experts Models: From Theory to Practice
  • Approximation and non-asymptotic model selection in mixture of experts models
  • Approximate Bayesian computation with surrogate posteriors
  • A non-asymptotic model selection in mixture of experts models
  • Approximate Bayesian computation with surrogate posteriors
  • Distance-based ABC procedures
  • Non-asymptotic penalization criteria for model selection in mixture of experts models
  • Approximate Bayesian computation with surrogate posteriors