Semi-implicit Functional Gradient Flow

Sunday, June 22, 2025


Time:   10:10 a.m. — 10:50 a.m.
Location: Lecture Hall, First Floor, Lei Jun Building, Wuhan University

Cheng Zhang
Peking University
Title:  Semi-implicit Functional Gradient Flow
Abstract:   Particle-based variational inference methods (ParVIs) use nonparametric variational families represented by particles to approximate the target distribution according to the kernelized Wasserstein gradient flow of the Kullback-Leibler (KL) divergence. Although functional gradient flows have been introduced to expand the kernel space for greater flexibility, the deterministic updating mechanism may limit exploration and require expensive repeated runs to obtain new samples. In this paper, we propose Semi-Implicit Functional Gradient flow (SIFG), a functional-gradient ParVI method that uses particles perturbed with Gaussian noise as the approximation family. We show that the corresponding functional gradient flow, which can be estimated via denoising score matching with neural networks, enjoys strong theoretical convergence guarantees owing to the higher-order smoothness that the Gaussian perturbation brings to the approximation family. In addition, we present an adaptive version of the method that automatically selects an appropriate noise magnitude during sampling, striking a good balance between exploration efficiency and approximation accuracy. Extensive experiments on both simulated and real-world datasets demonstrate the effectiveness and efficiency of the proposed framework.
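
To make the ideas in the abstract concrete, below is a minimal, illustrative Python/PyTorch sketch of a semi-implicit functional-gradient-style update: particles are perturbed with Gaussian noise, the score of the perturbed particle distribution is fit by denoising score matching with a small neural network, and the particles are then moved along the estimated Wasserstein gradient direction of the KL divergence, grad log pi minus grad log q_sigma. All specifics here (the toy Gaussian target, network size, sigma, step_size, iteration counts) are assumptions for illustration only; this is not the paper's exact SIFG algorithm and omits its adaptive noise selection.

# Toy sketch (not the authors' reference implementation) of a
# semi-implicit functional-gradient-style particle update.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Target: unnormalized 2-D Gaussian N(mu, I); its score is -(y - mu).
mu = torch.tensor([2.0, -1.0])
def target_score(y):
    return -(y - mu)

# Small MLP approximating the score of the Gaussian-perturbed particle distribution.
score_net = nn.Sequential(nn.Linear(2, 64), nn.SiLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(score_net.parameters(), lr=1e-3)

particles = torch.randn(512, 2) * 3.0   # initial particles
sigma, step_size = 0.3, 0.1             # illustrative noise level and step size

for it in range(200):
    # Denoising score matching: for y = x + sigma * eps, the optimal score
    # network satisfies s(y) ~ -eps / sigma, so regress onto that target.
    for _ in range(20):
        eps = torch.randn_like(particles)
        y = particles + sigma * eps
        loss = ((score_net(y) + eps / sigma) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Functional-gradient-style update: move particles along the estimated
    # velocity field grad log pi - grad log q_sigma, evaluated at freshly
    # perturbed particles (a simplification of the paper's scheme).
    with torch.no_grad():
        eps = torch.randn_like(particles)
        y = particles + sigma * eps
        velocity = target_score(y) - score_net(y)
        particles = particles + step_size * velocity

print("particle mean:", particles.mean(dim=0))  # should approach mu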
CV:   Cheng Zhang is an Assistant Professor and doctoral supervisor in the Department of Probability and Statistics, School of Mathematical Sciences, Peking University, and a jointly appointed researcher at the National Engineering Laboratory for Big Data Analysis and Applications. His research interests lie at the intersection of Bayesian inference, machine learning, and computational biology. His work on variational Bayesian phylogenetic inference with equivariant normalizing flows was invited for an oral presentation at NeurIPS 2020. He has published more than 20 papers as first or corresponding author in leading statistics journals and top machine learning conferences, including the Journal of the American Statistical Association, Journal of Machine Learning Research, Bayesian Analysis, Statistics and Computing, NeurIPS, ICML, and ICLR.