Luo Weijian, PKU
16:00–17:00, January 18, 2024, GMT+8
https://meeting.tencent.com, Meeting ID: 551-1675-5419
Diffusion models have become one of the foundational models for generative modeling, with numerous successful applications such as image and video generation, molecule design, and controllable data creation. The generation process of diffusion models requires solving generative ordinary differential equations (ODEs) or stochastic differential equations (SDEs), which is often computationally inefficient for high-dimensional data. In recent years, there has been a wide range of attempts to improve the data-generation efficiency of diffusion models from different perspectives. In this talk, we will give a brief introduction to existing diffusion distillation methods, which have been proposed from different perspectives to accelerate the sampling of diffusion models. In particular, we will focus on the Diff-Instruct method, which achieves strong one-step diffusion distillation performance in a distribution-matching manner. We will also compare and summarize other distillation approaches to give an intuitive understanding of diffusion distillation.
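To illustrate why sampling from a diffusion model is expensive, the sketch below integrates a generative ODE with the explicit Euler method: every step requires one evaluation of the (in practice, large) score network, so hundreds of steps mean hundreds of network calls. This is a minimal, hypothetical sketch, not code from the talk; the names (`euler_ode_sampler`, `score_fn`) are illustrative, and the linear drift is a toy stand-in for a learned score network.

```python
import numpy as np

def euler_ode_sampler(score_fn, x_T, t_grid):
    """Integrate dx/dt = score_fn(x, t) along t_grid with explicit Euler.

    In a real diffusion model, score_fn wraps a neural network, so the
    cost of sampling is proportional to len(t_grid) - 1 network calls.
    One-step distillation aims to replace this whole loop with a single
    forward pass of a distilled student network.
    """
    x = x_T
    for t, t_next in zip(t_grid[:-1], t_grid[1:]):
        dt = t_next - t
        x = x + dt * score_fn(x, t)  # one network evaluation per step
    return x

# Toy drift pulling samples toward the origin (stand-in for a score net).
drift = lambda x, t: -x

x_T = np.ones(4)                      # "noise" starting point at t = 1
t_grid = np.linspace(1.0, 0.0, 101)   # 100 Euler steps back to t = 0
x_0 = euler_ode_sampler(drift, x_T, t_grid)
```

For this toy drift the exact backward solution from x(1) = 1 is x(0) = e, and the 100-step Euler result is close to it; halving the step count roughly doubles the discretization error, which is the accuracy/cost trade-off that distillation methods sidestep.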
School of Mathematical Sciences, PKU