City Research Online

Generative Models from the perspective of Continual Learning

Garcia Ortiz, M. ORCID: 0000-0003-4729-7457, Lesort, T., Caselles-Dupré, H., Goudou, J-F. and Filliat, D. (2018). Generative Models from the perspective of Continual Learning. Paper presented at the Workshop on Continual Learning, NeurIPS 2018 - 32nd Conference on Neural Information Processing Systems, 07 December 2018, Montreal, Canada.


Which generative model is the most suitable for Continual Learning? This paper aims at evaluating and comparing generative models on disjoint sequential image generation tasks. We investigate how several models learn and forget, considering various strategies: rehearsal, regularization, generative replay and fine-tuning. We used two quantitative metrics to estimate the generation quality and memory ability. We experiment with sequential tasks on three commonly used benchmarks for Continual Learning (MNIST, Fashion MNIST and CIFAR10). We found that among all models, the original GAN performs best and among Continual Learning strategies, generative replay outperforms all other methods. Even if we found satisfactory combinations on MNIST and Fashion MNIST, training generative models sequentially on CIFAR10 is particularly unstable, and remains a challenge. Our code is available online.
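The generative replay strategy highlighted in the abstract can be sketched as a minimal training loop: when a new task arrives, samples drawn from the generator trained on previous tasks are mixed with the new task's data before retraining. The helpers `train_generator` and `sample_generator` below are hypothetical stand-ins, not the paper's implementation:

```python
import random

def generative_replay(tasks, train_generator, sample_generator):
    """Train a generator over a sequence of tasks using generative replay.

    `train_generator(data)` and `sample_generator(gen, n)` are hypothetical
    callables supplied by the user: the first fits a generative model to a
    dataset, the second draws n samples from a fitted model.
    """
    generator = None
    for task_data in tasks:
        if generator is None:
            combined = list(task_data)
        else:
            # Mix the current task's real data with replayed samples drawn
            # from the generator trained on all previous tasks, so earlier
            # tasks are rehearsed without storing their raw data.
            replayed = sample_generator(generator, len(task_data))
            combined = list(task_data) + replayed
        generator = train_generator(combined)
    return generator

# Toy stand-ins: the "generator" simply memorizes its training set and
# samples by drawing uniformly from it.
final = generative_replay(
    tasks=[[0, 0], [1, 1]],
    train_generator=lambda data: list(data),
    sample_generator=lambda gen, n: [random.choice(gen) for _ in range(n)],
)
# After the second task, the final generator still covers the first task.
```

In the paper's setting the generator would be a GAN (or VAE) and the tasks disjoint subsets of MNIST, Fashion MNIST or CIFAR10; this toy version only illustrates the data flow of replay.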

Publication Type: Conference or Workshop Item (Paper)
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Mathematics, Computer Science & Engineering > Computer Science
Date Deposited: 08 Aug 2019 14:47