Continual Learning meets Generative Modeling
Neural networks suffer from catastrophic forgetting: a sudden loss of performance on previously learned tasks when trained on new ones. Retraining models from scratch is often impractical due to data size and time constraints. This limitation hampers neural networks' capabilities and motivates the need for continual learning. In this talk, I will discuss the most promising recent continual learning methods, including several works that employ generative models.
Bio
Kamil Deja is a postdoc at IDEAS NCBR and Warsaw University of Technology. He was a Visiting Researcher at Vrije Universiteit Amsterdam in 2022 and a science intern at Amazon Alexa in 2021 and 2022. Since 2018, he has been a member of the ALICE Collaboration at CERN. His research focuses on generative modelling and its application to continual learning. He has published his work at major ML conferences, including NeurIPS, IJCAI, Interspeech, and ICASSP.