Michael Minyi Zhang
Michael Zhang has been a post-doctoral researcher in the Department of Computer Science at Princeton University since October 2018, under the supervision of Profs. Barbara Engelhardt and Brandon Stewart. His research interests include statistical machine learning, scalable inference, and Bayesian non-parametrics. He earned a Ph.D. in statistics at the University of Texas at Austin, where he was advised by Prof. Sinead Williamson.
- I am currently on the academic job market. You may read my research statement and teaching statement here.
- I am happy to announce that "Distributed, partially collapsed MCMC for Bayesian Nonparametrics" and "Patient-Specific Effects of Medication Using Latent Force Models with Gaussian Processes" have been accepted to AISTATS 2020.
- "Embarrassingly Parallel Inference for Gaussian Processes" has been published in the Journal of Machine Learning Research. I am happy to announce that I will give a contributed talk on this paper at Bayes Comp 2020.
- I have uploaded a new version of "Accelerated Parallel Non-conjugate Sampling for Bayesian Non-parametric Models" (previously titled "Accelerated Inference for Latent Variable Models") to arXiv, with code available.
- A new revision of "Sequential Gaussian Processes for Online Learning of Nonstationary Functions" is available.
- "Probabilistic Time of Arrival Localization" has been accepted for publication in IEEE Signal Processing Letters.
- "A new class of time-dependent latent factor models with applications" has been accepted for publication in the Journal of Machine Learning Research.