We will discuss two recent works related to the analytical and numerical study of stochastic differential equations with additive noise.

First, we will discuss the theoretical understanding of the complexity of simulating Langevin dynamics. With limited computational resources, it is impossible to design an algorithm of arbitrarily high order. For simulating underdamped Langevin dynamics, we prove that in certain scenarios the randomized midpoint method is order-optimal, meaning that no algorithm can achieve a better order of convergence.

Second, we study diffusion models, a deep-learning method that has received much attention for generating artificial images. We aim to answer the following question: how does the choice of dynamics in the generative process affect the quality of the generated samples? Is an ODE better, or an SDE with a non-vanishing diffusion coefficient? We identify situations in which one of the two dynamics outperforms the other and support these findings with numerical validation.
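For reference, here is a brief sketch of the two settings; the notation is chosen for illustration and is not necessarily that of the talk. Underdamped Langevin dynamics with potential $f$, friction $\gamma > 0$, and Brownian motion $B_t$ reads

\[
dX_t = V_t \, dt, \qquad dV_t = -\gamma V_t \, dt - \nabla f(X_t) \, dt + \sqrt{2\gamma} \, dB_t,
\]

and the randomized midpoint method replaces the time integral of the force over a step of size $h$ by the unbiased one-point quadrature $\int_0^h g(s)\, ds \approx h\, g(\alpha h)$ with $\alpha \sim \mathrm{Uniform}[0,1]$. For diffusion models with forward drift $f(x,t)$, diffusion coefficient $g(t)$, and marginal densities $p_t$, the generative process can be run either as the reverse-time SDE

\[
dX_t = \bigl[ f(X_t, t) - g(t)^2 \nabla \log p_t(X_t) \bigr] dt + g(t)\, d\bar{B}_t,
\]

or as the probability-flow ODE

\[
\frac{dX_t}{dt} = f(X_t, t) - \tfrac{1}{2} g(t)^2 \nabla \log p_t(X_t).
\]

Both share the same marginals when the score $\nabla \log p_t$ is exact; the question addressed in the talk concerns how they compare when the score is only approximated.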

Speaker

A/Prof. Yu Cao

Institute of Natural Sciences

Time

2023.5.31 12:00-13:30