City Research Online

Generative Models from the perspective of Continual Learning

Garcia Ortiz, M. ORCID: 0000-0003-4729-7457, Lesort, T., Caselles-Dupré, H. , Goudou, J-F. & Filliat, D. (2019). Generative Models from the perspective of Continual Learning. Proceedings of the International Joint Conference on Neural Networks, pp. 1-8. ISSN 2161-4393 doi: 10.1109/IJCNN.2019.8851986


Which generative model is the most suitable for Continual Learning? This paper aims at evaluating and comparing generative models on disjoint sequential image generation tasks. We investigate how several models learn and forget, considering various strategies: rehearsal, regularization, generative replay and fine-tuning. We used two quantitative metrics to estimate the generation quality and memory ability. We experiment with sequential tasks on three commonly used benchmarks for Continual Learning (MNIST, Fashion MNIST and CIFAR10). We found that among all models, the original GAN performs best and among Continual Learning strategies, generative replay outperforms all other methods. Even if we found satisfactory combinations on MNIST and Fashion MNIST, training generative models sequentially on CIFAR10 is particularly unstable, and remains a challenge. Our code is available online.
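The generative replay strategy mentioned above can be illustrated with a minimal sketch: when training on a new task, each batch mixes fresh samples from the current task with samples replayed from a generator trained on earlier tasks, so that old knowledge is rehearsed without storing old data. The function name, the replay ratio, and the toy data below are illustrative assumptions, not the paper's actual implementation.

```python
import random

def generative_replay_batch(current_task_data, prev_generator,
                            replay_ratio=0.5, batch_size=8):
    """Build one training batch mixing current-task samples with
    samples replayed from the previous generator (hypothetical sketch)."""
    n_replay = int(batch_size * replay_ratio)   # samples drawn from the old generator
    n_new = batch_size - n_replay               # samples drawn from the new task
    batch = [random.choice(current_task_data) for _ in range(n_new)]
    batch += [prev_generator() for _ in range(n_replay)]
    random.shuffle(batch)
    return batch

# Toy usage: items are (task_label, value) pairs; the "generator" stands in
# for a model trained on task 1 and replays task-1-like samples.
task2_data = [("task2", i) for i in range(100)]
old_generator = lambda: ("task1", random.randint(0, 99))
batch = generative_replay_batch(task2_data, old_generator,
                                replay_ratio=0.5, batch_size=8)
```

In a full pipeline, `old_generator` would itself be retrained after each task on such mixed batches, so a single model accumulates all tasks seen so far.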

Publication Type: Conference or Workshop Item (Paper)
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Science & Technology > Computer Science