Jun Liu
Tsinghua University
Dr. Jun Liu is Xinghua Distinguished University Professor and Chair of the Department of Statistics and Data Science at Tsinghua University, and a member of the National Academy of Sciences of the USA. He received his BS degree in mathematics from Peking University in 1985 and his Ph.D. in statistics from the University of Chicago in 1991. From 1991 to 2025, he held Assistant, Associate, and Full Professorships at Harvard and Stanford Universities. Liu won the COPSS Presidents' Award in 2002, the Morningside Gold Medal in Applied Mathematics in 2010, and the Pao-Lu Hsu Award from the ICSA in 2016. He was elected a Fellow of the IMS, the ASA, and the ISCB in 2004, 2005, and 2022, respectively, and a member of the National Academy of Sciences of the USA in 2025. Liu served as co-editor of the flagship statistics journal JASA from 2011 to 2014, as an associate editor for leading statistical journals, and as a chair or member of various grant review panels. He has co-authored over 300 research articles in leading scientific journals, conference proceedings, and books, with a Google Scholar citation count of more than 97,000. Over the past four decades, he has mentored more than 40 PhD students and 30 postdoctoral fellows. Liu's research interests include Bayesian methods and computation, statistical machine learning and AI, Monte Carlo methods, state-space models, and bioinformatics and computational biology.
Title: Conditional sampling via diffusion flow and SMC
Abstract: Sequential Monte Carlo (SMC), also known as particle filtering, refers to a class of Monte Carlo methods that accommodates dynamic structures and can serve as a learning mechanism. The scheme constructs the sampling distribution recursively and reweights the samples sequentially, so as to “learn” as new information becomes available. Recently, diffusion models have become a popular tool for learning high-dimensional data distributions and generating samples from them. I will review a brief history of SMC and some developments in diffusion modeling. By combining the ODE-based flow method with SMC, we propose a training-free conditional sampling method for diffusion models. Because a naive application of importance sampling suffers from weight degeneracy in high-dimensional settings, resampling and rejection sampling ideas are necessary. To encourage generated samples to diverge along distinct trajectories, we derive a stochastic flow with adjustable noise strength to replace the deterministic flow at the intermediate stage. Experimentally, our method significantly outperforms existing approaches on conditional sampling tasks for MNIST and CIFAR-10.
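The interplay between sequential reweighting and resampling that the abstract alludes to can be illustrated with a minimal bootstrap particle filter. The Python sketch below uses an entirely hypothetical linear-Gaussian toy model (all parameters invented for illustration) and shows only the generic SMC mechanism, not the proposed conditional diffusion sampler.

# Minimal sketch of sequential importance resampling (SIR), the generic
# SMC mechanism; illustrative only, not the authors' method.
import numpy as np

rng = np.random.default_rng(0)

T, N = 50, 1000             # time steps, particles (hypothetical choices)
phi, q, r = 0.95, 1.0, 1.0  # AR(1) coefficient, state and observation noise sd

# Simulate a latent AR(1) path and noisy observations.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + q * rng.normal()
y = x_true + r * rng.normal(size=T)

# Bootstrap particle filter: propagate, weight by the likelihood, resample.
particles = rng.normal(size=N)
est = np.zeros(T)
for t in range(T):
    particles = phi * particles + q * rng.normal(size=N)  # propagate
    logw = -0.5 * ((y[t] - particles) / r) ** 2           # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * particles)                        # filtered mean
    # Resampling counters weight degeneracy: without it, after a few steps
    # almost all of the weight concentrates on a handful of particles.
    idx = rng.choice(N, size=N, p=w)
    particles = particles[idx]

print("mean squared filtering error:", np.mean((est - x_true) ** 2))

Dropping the resampling step and instead accumulating the weights multiplicatively across time reproduces the degeneracy problem mentioned above, which is why plain importance sampling fails in long or high-dimensional sequences.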
Richard J. Samworth
University of Cambridge
Richard Samworth obtained his PhD in Statistics from the University of Cambridge in 2004 and has remained in Cambridge ever since, becoming a full professor in 2013 and the Professor of Statistical Science in 2017. His main research interests are in nonparametric and high-dimensional statistics, as well as the statistical foundations of AI; he has developed methods and theory for shape-constrained inference, missing data, subgroup selection, deep learning, data perturbation techniques, changepoint estimation, variable selection, and independence testing. Richard received the COPSS Presidents' Award in 2018, was elected a Fellow of the Royal Society in 2021, and was awarded the David Cox Medal for Statistics in 2025. He served as co-editor of the Annals of Statistics (2019-2021) and is currently IMS President-Elect.
Title: Outrigger local polynomial regression
Abstract: Standard local polynomial estimators of a nonparametric regression function employ a weighted least squares loss function that is tailored to the setting of homoscedastic Gaussian errors. We introduce the outrigger local polynomial estimator, which is designed to achieve distributional adaptivity across different conditional error distributions. It modifies a standard local polynomial estimator by employing an estimate of the conditional score function of the errors and an 'outrigger' that draws on the data in a broader local window to stabilise the influence of the conditional score estimate. Subject to smoothness and moment conditions, and only requiring consistency of the conditional score estimate, we first establish that even under the least favourable settings for the outrigger estimator, the asymptotic ratio of the worst-case local risks of the two estimators is at most 1, with equality if and only if the conditional error distribution is Gaussian. Moreover, we prove that the outrigger estimator is minimax optimal over Hölder classes up to a multiplicative factor $A_{\beta,d}$, depending only on the smoothness $\beta \in (0,\infty)$ of the regression function and the dimension $d$ of the covariates. When $\beta \in (0,1]$, we find that $A_{\beta,d} \leq 1.69$, with $\lim_{\beta \searrow 0} A_{\beta,d} =1$. A further attraction of our proposal is that we do not require structural assumptions such as independence of errors and covariates, or symmetry of the conditional error distribution. Numerical results on simulated and real data validate our theoretical findings; our methodology is implemented in the R package \texttt{outrigger}.
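For background, the "standard local polynomial estimator" referred to above solves a kernel-weighted least squares problem; the following display is textbook material rather than notation from the paper (the kernel $K$, bandwidth $h$ and degree $p$ are generic). Writing $\hat m(x) = \hat\beta_0$ for the fitted intercept,
\[
\hat\beta \in \operatorname*{argmin}_{\beta = (\beta_\alpha)_{|\alpha| \leq p}} \sum_{i=1}^n K\Bigl(\frac{X_i - x}{h}\Bigr)\Bigl(Y_i - \sum_{|\alpha| \leq p} \beta_\alpha (X_i - x)^\alpha\Bigr)^2.
\]
It is this squared-error loss, tailored to homoscedastic Gaussian errors, that the outrigger estimator modifies via an estimated conditional score of the errors.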