Jun Liu

Affiliation: Tsinghua University
Dr. Jun Liu is Xinghua Distinguished University Professor and Chair of the Department of Statistics and Data Science at Tsinghua University, and a member of the National Academy of Sciences of the USA. He received his B.S. in mathematics from Peking University in 1985 and his Ph.D. in statistics from the University of Chicago in 1991. From 1991 to 2025, he held Assistant, Associate, and Full Professorships at Harvard and Stanford Universities. Liu won the COPSS Presidents' Award in 2002, the Morningside Gold Medal in Applied Mathematics in 2010, and the Pao-Lu Hsu Award from the ICSA in 2016. He was elected a Fellow of the IMS, ASA, and ISCB in 2004, 2005, and 2022, respectively, and to the National Academy of Sciences of the USA in 2025. Liu served as co-editor of the flagship statistics journal JASA from 2011 to 2014, as an associate editor for leading statistical journals, and as a chair or member of various grant review panels. He has co-authored over 300 research articles published in leading scientific journals, conferences, and books, with a Google Scholar citation count of more than 97,000. Over the past four decades, he has mentored more than 40 Ph.D. students and 30 postdoctoral fellows. His research interests include Bayesian methods and computation, statistical machine learning and AI, Monte Carlo methods, state-space models, bioinformatics, and computational biology.
Title: Conditional sampling via diffusion flow and SMC
Abstract: Sequential Monte Carlo (SMC), also known as particle filtering, refers to a class of Monte Carlo methods that accommodate dynamic structures and can serve as a learning mechanism. The scheme builds the sampling distribution recursively and adjusts the obtained samples by sequentially updated importance weights, so as to "learn" as new information becomes available. Recently, diffusion models have become a very popular tool for learning a high-dimensional data distribution and generating samples from it. I will review a brief history of SMC and some developments in diffusion modeling. By combining the ODE-based flow method with SMC, we propose a training-free conditional sampling method for diffusion models. Because a naive application of importance sampling suffers from weight degeneracy in high-dimensional settings, ideas from resampling and rejection sampling are necessary. To encourage generated samples to diverge along distinct trajectories, we derive a stochastic flow with adjustable noise strength to replace the deterministic flow at the intermediate stage. Experimentally, our method significantly outperforms existing approaches on conditional sampling tasks for MNIST and CIFAR-10.
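The sequential weighting and resampling mechanism mentioned in the abstract can be sketched with a toy bootstrap particle filter on a one-dimensional linear-Gaussian state-space model. This is a minimal illustration only; the model parameters (AR coefficient 0.9, observation noise 0.5) and the multinomial resampling step are illustrative choices, not the method of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 1D linear-Gaussian state-space model:
#   x_t = 0.9 * x_{t-1} + v_t,  v_t ~ N(0, 1)
#   y_t = x_t + w_t,            w_t ~ N(0, 0.5^2)
T = 50
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = x + rng.normal(scale=0.5, size=T)

# Bootstrap particle filter: propagate, weight by likelihood, resample.
N = 1000                                               # number of particles
particles = rng.normal(size=N)
means = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + rng.normal(size=N)   # propagate through dynamics
    logw = -0.5 * ((y[t] - particles) / 0.5) ** 2      # log observation likelihood
    w = np.exp(logw - logw.max())                      # stabilize before normalizing
    w /= w.sum()
    means[t] = np.sum(w * particles)                   # filtering mean estimate
    idx = rng.choice(N, size=N, p=w)                   # multinomial resampling
    particles = particles[idx]                         # combats weight degeneracy

rmse = np.sqrt(np.mean((means - x) ** 2))
print(f"filtering RMSE vs. true state: {rmse:.3f}")
```

The resampling step is what keeps the particle population from collapsing onto a few heavily weighted samples, the weight-degeneracy problem the abstract refers to in high dimensions.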