14:00-15:30, May 29, 2023, GMT+8
Science Building 2, Room 2129E
Text generation is the task of writing meaningful and coherent text, and it is an umbrella term covering various natural language generation applications such as summarization and machine translation. One of the major challenges in text generation is uncertainty, as natural language utterances are highly unpredictable. For example, a dialogue model may have multiple plausible responses when asked about its hobby, which makes training difficult because the model tries to learn all the different ways to respond. In this talk, I will go over two of my recent works addressing this problem. My first work focuses on using a mixture model for diverse dialogue generation, where we propose a novel EM algorithm that allows different mixture components to capture distinct dialogue patterns. My second work addresses uncertainty in the sequential knowledge distillation setting, where we propose FDISTILL, a novel framework that formulates sequential knowledge distillation as minimizing a generalized f-divergence. Both studies highlight the importance of tackling uncertainty in text generation.
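For readers unfamiliar with the f-divergence family mentioned above, the following toy sketch illustrates the general idea: a single formula, D_f(P || Q) = Σ q(x) f(p(x)/q(x)), recovers different divergences (e.g., KL, total variation) depending on the choice of the convex function f. This is a minimal illustration of the concept only, not the FDISTILL implementation; the distributions here are made-up toy examples.

```python
import math

def f_divergence(p, q, f):
    """Compute D_f(P || Q) = sum over x of q(x) * f(p(x) / q(x))
    for two discrete distributions given as probability lists."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

# Choosing f(t) = t * log(t) recovers KL(P || Q):
#   q * (p/q) * log(p/q) = p * log(p/q)
kl = lambda t: t * math.log(t)

# Choosing f(t) = 0.5 * |t - 1| recovers total variation distance:
#   q * 0.5 * |p/q - 1| = 0.5 * |p - q|
tv = lambda t: 0.5 * abs(t - 1)

# Toy "teacher" and "student" next-token distributions over 3 tokens
teacher = [0.7, 0.2, 0.1]
student = [0.5, 0.3, 0.2]

print(f_divergence(teacher, student, kl))  # KL divergence, > 0 since P != Q
print(f_divergence(teacher, student, tv))  # total variation distance
```

In a distillation setting, the student would be trained to minimize such a divergence between its output distribution and the teacher's; different choices of f trade off how the student treats modes of the teacher distribution.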
Dr. Lili Mou is an Assistant Professor in the Department of Computing Science, University of Alberta. He is also an Alberta Machine Intelligence Institute (Amii) Fellow and a Canada CIFAR AI (CCAI) Chair. Lili received his BS and PhD degrees in 2012 and 2017, respectively, from the School of EECS, Peking University. After that, he worked as a postdoctoral fellow at the University of Waterloo. His research interests include deep learning applied to natural language processing as well as programming language processing. He has publications at top conferences and journals, including AAAI, EMNLP, TACL, ICML, ICLR, and NeurIPS. He also presented tutorials at EMNLP'19 and ACL'20. He received an AAAI New Faculty Highlight Award in 2021.
Source: School of Computer Science