Yale Institute for Foundations of Data Science, Kline Tower 13th Floor, Room 1327, New Haven, CT 06511

Speaker: Yunbum Kook (Georgia Tech)
Hosted by: Andre Wibisono, Assistant Professor, Computer Science Department
Talk Title: Gaussian Cooling and the Dikin Walk: The Interior-Point Method for Logconcave Sampling
Abstract: The connections between convex optimization and logconcave sampling have been considerably enriched in the past decade with many conceptual and mathematical analogies. For instance, the Langevin algorithm can be viewed as a sampling analogue of gradient descent and has condition-number-dependent guarantees on its performance. In the early 1990s, Nesterov and Nemirovski developed the Interior-Point Method (IPM) for convex optimization based on self-concordant barriers, providing efficient algorithms for structured convex optimization that are often faster than general-purpose methods. This raises the following question: can we develop an analogous IPM for structured sampling problems?
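As a rough illustration of the analogy mentioned above (an editorial aside, not part of the abstract), consider a target density proportional to exp(-f) for a smooth convex f with step size h > 0:

    Gradient descent:       x_{k+1} = x_k - h * grad f(x_k)
    Unadjusted Langevin:    x_{k+1} = x_k - h * grad f(x_k) + sqrt(2h) * xi_k,   xi_k ~ N(0, I)

The only difference is the injected Gaussian noise, and for strongly logconcave targets the guarantees of both methods depend on the condition number of f.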
In 2012, Kannan and Narayanan proposed the Dikin walk for uniformly sampling polytopes, and an improved analysis was given in 2020 by Laddha, Lee, and Vempala. The Dikin walk uses a local metric defined by a self-concordant barrier for linear constraints. In this talk, we demonstrate the generality of this approach by developing and adapting the machinery of IPMs, together with the Dikin walk, into polynomial-time sampling algorithms. Our IPM-based sampling framework provides an efficient warm start and goes beyond uniform distributions and linear constraints. As an illustrative example, we show how this approach yields the fastest algorithms for sampling uniform, exponential, or Gaussian distributions on a truncated PSD cone.
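For readers unfamiliar with the Dikin walk, the following is a minimal sketch (an editorial illustration, not the speaker's code) of the basic walk for uniform sampling from a polytope {x : Ax <= b}, using the Hessian of the standard log-barrier as the local metric and a Metropolis-Hastings filter; the step radius r and all helper names are illustrative choices.

    # Minimal sketch of the Dikin walk for uniform sampling from {x : A x <= b},
    # using the log-barrier phi(x) = -sum_i log(b_i - a_i^T x) as the local metric.
    import numpy as np

    def barrier_hessian(A, b, x):
        """Hessian of the log-barrier at x: sum_i a_i a_i^T / (b_i - a_i^T x)^2."""
        s = b - A @ x                      # slacks, positive for a strictly feasible x
        return (A / s[:, None]**2).T @ A

    def proposal_log_density(A, b, center, point, sigma):
        """Log density (up to a shared constant) of N(center, sigma^2 H(center)^{-1}) at point."""
        H = barrier_hessian(A, b, center)
        d = point - center
        _, logdet = np.linalg.slogdet(H)
        return 0.5 * logdet - d @ H @ d / (2 * sigma**2)

    def dikin_walk(A, b, x0, n_steps, r=0.4, seed=None):
        """Run the Dikin walk from a strictly feasible x0; returns the chain of states."""
        rng = np.random.default_rng(seed)
        n = len(x0)
        sigma = r / np.sqrt(n)
        x = np.array(x0, dtype=float)
        chain = [x.copy()]
        for _ in range(n_steps):
            # Propose y ~ N(x, sigma^2 H(x)^{-1}) via a Cholesky solve.
            H = barrier_hessian(A, b, x)
            L = np.linalg.cholesky(H)
            y = x + sigma * np.linalg.solve(L.T, rng.standard_normal(n))
            # Reject immediately if y leaves the polytope.
            if np.all(b - A @ y > 0):
                # Metropolis-Hastings filter (the proposal density is not symmetric).
                log_ratio = (proposal_log_density(A, b, y, x, sigma)
                             - proposal_log_density(A, b, x, y, sigma))
                if np.log(rng.uniform()) < min(0.0, log_ratio):
                    x = y
            chain.append(x.copy())
        return np.array(chain)

    # Example: uniform samples from the unit square [0, 1]^2.
    A = np.vstack([np.eye(2), -np.eye(2)])
    b = np.array([1.0, 1.0, 0.0, 0.0])
    samples = dikin_walk(A, b, x0=np.array([0.5, 0.5]), n_steps=5000)

The local ellipsoid defined by H(x) automatically shrinks steps near the boundary, which is what allows the walk to stay inside the polytope and mix in polynomial time without any separate boundary handling.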
Bio: Yunbum Kook is a third-year PhD student in the School of Computer Science at Georgia Institute of Technology, advised by Santosh Vempala. He received a B.S. in Mathematical Sciences from KAIST in 2021. He is interested in sampling and theoretical aspects of machine learning.
Website: https://yunbum-kook.github.io/