City Research Online

Generative Models from the perspective of Continual Learning

Garcia Ortiz, M. ORCID: 0000-0003-4729-7457, Lesort, T., Caselles-Dupré, H., Goudou, J-F. & Filliat, D. (2019). Generative Models from the perspective of Continual Learning. In: Proceedings of the International Joint Conference on Neural Networks (IJCNN). Presented at the Workshop on Continual Learning, NeurIPS 2018 (32nd Conference on Neural Information Processing Systems), 07 December 2018, Montreal, Canada. doi: 10.1109/IJCNN.2019.8851986

Abstract

Which generative model is the most suitable for Continual Learning? This paper aims at evaluating and comparing generative models on disjoint sequential image generation tasks. We investigate how several models learn and forget, considering various strategies: rehearsal, regularization, generative replay and fine-tuning. We use two quantitative metrics to estimate the generation quality and memory ability. We experiment with sequential tasks on three commonly used benchmarks for Continual Learning (MNIST, Fashion MNIST and CIFAR10). We found that, among all models, the original GAN performs best and that, among Continual Learning strategies, generative replay outperforms all other methods. Even though we found satisfactory combinations on MNIST and Fashion MNIST, training generative models sequentially on CIFAR10 is particularly unstable and remains a challenge. Our code is available online.
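The abstract's best-performing strategy, generative replay, avoids storing real data from past tasks: before each new task, a frozen copy of the generator is kept, and its samples are mixed with the current task's real data so that earlier distributions keep being rehearsed. Below is a minimal sketch of this idea in Python/PyTorch. The `trainer` callback, the generator's `latent_dim` attribute and the `replay_ratio` argument are hypothetical placeholders for illustration; this is not the authors' released implementation.

```python
# Minimal sketch of generative replay for sequential generative-model training.
# Hypothetical names: `trainer(generator, batch)` is assumed to run one
# optimization step (e.g., a GAN or VAE update), and the generator is assumed
# to expose a `latent_dim` attribute.

import copy
import torch

def train_with_generative_replay(generator, trainer, tasks, replay_ratio=1.0):
    """Train `generator` sequentially on `tasks` (iterables of real batches)."""
    previous_generator = None  # frozen snapshot that "remembers" past tasks
    for task_data in tasks:
        for real_batch in task_data:
            if previous_generator is not None:
                # Replay samples from past tasks and mix them with real data.
                with torch.no_grad():
                    n_replay = int(replay_ratio * real_batch.size(0))
                    z = torch.randn(n_replay, generator.latent_dim)
                    replay_batch = previous_generator(z)
                batch = torch.cat([real_batch, replay_batch], dim=0)
            else:
                batch = real_batch  # first task: no past data to replay
            trainer(generator, batch)
        # Freeze a copy of the generator so this task can be replayed later.
        previous_generator = copy.deepcopy(generator).eval()
```

The key design point this sketch illustrates is that memory of old tasks lives entirely in the frozen generator copy, so storage cost stays constant in the number of tasks, unlike rehearsal, which must retain real samples.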

Publication Type: Conference or Workshop Item (Paper)
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Science & Technology > Computer Science
Full text: 1812.09111.pdf (Accepted Version, 5MB)
