Michael Minyi Zhang
Michael Zhang has been an assistant professor in the Department of Statistics and Actuarial Science at the University of Hong Kong since January 2021. His research interests include statistical machine learning, scalable inference, and Bayesian nonparametrics. Previously, he was a postdoctoral researcher at Princeton University under the supervision of Profs. Barbara Engelhardt and Brandon Stewart, and he earned a Ph.D. in statistics at the University of Texas at Austin, where he was advised by Prof. Sinead Williamson.
News
- I am happy to announce I have received a Sino-British Fellowship Trust Visitorship.
- I will serve as an area chair at AISTATS 2025.
- A Semi-Bayesian Nonparametric Hypothesis Test Using Maximum Mean Discrepancy with Applications in Generative Adversarial Networks has been published in Transactions on Machine Learning Research.
- I am now an Associate Editor at Statistics and Computing.
- Preventing Model Collapse in Gaussian Process Latent Variable Models has been accepted to ICML 2024.
- Online/Offline Learning to Enable Robust Beamforming: Limited Feedback Meets Deep Generative Models is now available on arXiv.
- Accelerated Algorithms for Convex and Non-Convex Optimization on Manifolds has been accepted to Machine Learning.
- Online Student-t Processes with an Overall-local Scale Structure for Modelling Non-stationary Data is now available on arXiv.
- A BNP Approach to Generative Models: Integrating VAEs and GANs using WMMD is now available on arXiv.
- Bayesian Non-linear Latent Variable Modeling via Random Fourier Features has been submitted for review, with code available here.
- Overcoming Posterior Collapse in Variational Autoencoders via EM-type Training has been published at ICASSP 2023, where it was accepted for an oral presentation.
- Sequential Gaussian Processes for Online Learning of Nonstationary Functions has been published in IEEE Transactions on Signal Processing, with code available here.
- Accelerated Parallel Non-conjugate Sampling for Bayesian Non-parametric Models is now available in Statistics and Computing.
- Sparse Infinite Random Feature Latent Variable Modeling is now available on arXiv.
- Latent Variable Modeling with Random Features has been accepted to AISTATS 2021.