

Poster

Ito Diffusion Approximation of Universal Ito Chains for Sampling, Optimization and Boosting

Aleksei Ustimenko · Aleksandr Beznosikov

Halle B #51
Fri 10 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract: In this work, we consider a rather general and broad class of Markov chains, Ito chains, that look like an Euler-Maruyama discretization of some Stochastic Differential Equation. The chain we study is a unified framework for theoretical analysis. It admits almost arbitrary isotropic and state-dependent noise instead of the normal, state-independent noise assumed in most related papers. Moreover, the drift and diffusion coefficients in our chain can be inexact, in order to cover a wide range of applications such as Stochastic Gradient Langevin Dynamics, sampling, Stochastic Gradient Descent, and Stochastic Gradient Boosting. We prove a bound in the $\mathcal{W}_{2}$-distance between the laws of our Ito chain and the corresponding differential equation. These results improve on or cover most of the known estimates, and for some particular cases our analysis is the first.
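To make the setting concrete, here is a minimal sketch of the Euler-Maruyama discretization the abstract refers to, for an SDE of the form $dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t$. The function names (`b`, `sigma`, `em_step`, `simulate`) and the Ornstein-Uhlenbeck example are illustrative assumptions, not taken from the paper; the paper's Ito chains generalize this update by allowing inexact drift/diffusion and non-Gaussian, state-dependent noise in place of the Gaussian increment below.

```python
import math
import random


def em_step(x, b, sigma, h, rng):
    """One Euler-Maruyama step of size h for dX = b(X) dt + sigma(X) dW."""
    xi = rng.gauss(0.0, 1.0)  # standard normal increment; Ito chains relax this
    return x + h * b(x) + sigma(x) * math.sqrt(h) * xi


def simulate(x0, b, sigma, h, n_steps, seed=0):
    """Run the chain for n_steps steps from x0 and return the final state."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n_steps):
        x = em_step(x, b, sigma, h, rng)
    return x


# Example: Ornstein-Uhlenbeck drift b(x) = -x with constant diffusion 0.5.
x_T = simulate(1.0, lambda x: -x, lambda x: 0.5, h=0.01, n_steps=1000)
```

With a small step size `h`, the law of the chain stays close to that of the continuous-time diffusion; the paper quantifies this closeness in the $\mathcal{W}_{2}$-distance under much weaker assumptions on the noise and coefficients.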
