Michael Minyi Zhang
Michael Zhang has been an assistant professor in the Department of Statistics and Actuarial Science at the University of Hong Kong since January 2021. His research interests include statistical machine learning, scalable inference, and Bayesian nonparametrics. He was previously a post-doctoral researcher at Princeton University under the supervision of Profs. Barbara Engelhardt and Brandon Stewart, and earned a Ph.D. in Statistics at the University of Texas at Austin, where he was advised by Prof. Sinead Williamson.
- A new revision of Sequential Gaussian Processes for Online Learning of Nonstationary Functions is available, with accompanying code here.
- Accelerated Parallel Non-conjugate Sampling for Bayesian Non-parametric Models is now available in Statistics and Computing.
- Sparse Infinite Random Feature Latent Variable Modeling is now available on arXiv.
- Latent Variable Modeling with Random Features has been accepted to AISTATS 2021.
- Our new preprint, Accelerated Algorithms for Convex and Non-Convex Optimization on Manifolds, is now available on arXiv.
- Distributed, partially collapsed MCMC for Bayesian Nonparametrics and Patient-Specific Effects of Medication Using Latent Force Models with Gaussian Processes have been published in AISTATS 2020.
- A New Class of Time-dependent Latent Factor Models with Applications has been published in the Journal of Machine Learning Research.
- Embarrassingly Parallel Inference for Gaussian Processes has been published in the Journal of Machine Learning Research. I am happy to announce that I will give a contributed talk on this paper at Bayes Comp 2020.