Machine Learning and Data Science PhD Student Seminar Series (Session 91): Shifted Composition for Bounding Information-Theoretic Divergences
Abstract:
Bounding the divergence between the laws of two stochastic processes is a classical topic with many applications in sampling and beyond.
Standard approaches, such as the Girsanov method and the interpolation method, control the KL-divergence error for some basic sampling
algorithms; however, it is unclear how to extend these methods to more general algorithms or settings.
In this talk, we introduce the shifted composition rule, based on a series of works by Altschuler et al. We apply this information-theoretic
principle to develop a user-friendly framework for bounding the long-time discretization error of sampling algorithms. We also discuss its
application to proving reverse transport inequalities for diffusions.
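As background (not stated in the abstract above): the shifted composition rule refines the classical, unshifted composition rule for KL divergence, a standard fact which for initial laws \(\mu, \nu\) and Markov kernels \(P, Q\) follows from the chain rule together with the data-processing inequality:

```latex
\[
\mathsf{KL}(\mu P \,\|\, \nu Q)
\;\le\;
\mathsf{KL}(\mu \,\|\, \nu)
\;+\;
\mathbb{E}_{x \sim \mu}\,
\mathsf{KL}\!\big(P(x,\cdot) \,\|\, Q(x,\cdot)\big).
\]
```

The shifted variant due to Altschuler et al., and its precise statement, will be presented in the talk.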
About the forum:
This online forum is held biweekly. Each session invites a PhD student to give a systematic and in-depth introduction to a frontier topic;
themes include, but are not limited to, machine learning, high-dimensional statistics, operations research and optimization, and theoretical
computer science.
Tencent Meeting: 989-3593-2097