FocusLearn: Fully-Interpretable, High-Performance Modular Neural Networks for Time Series
Su, Q., Kloukinas, C. ORCID: 0000-0003-0424-7425 & d’Avila Garcez, A. ORCID: 0000-0001-7375-9518 (2024). FocusLearn: Fully-Interpretable, High-Performance Modular Neural Networks for Time Series. In: 2024 International Joint Conference on Neural Networks (IJCNN). 2024 International Joint Conference on Neural Networks (IJCNN), 30 Jun - 5 Jul 2024, Yokohama, Japan. doi: 10.1109/ijcnn60899.2024.10651481
Abstract
Multivariate time series arise in many areas, from healthcare and finance to meteorology and the life sciences. Although deep neural networks have shown excellent predictive performance on time series, they have been criticised for being non-interpretable. Neural Additive Models, by contrast, are fully interpretable by construction, but may achieve far lower predictive performance than deep networks when applied to time series. This paper introduces FocusLearn, a fully-interpretable modular neural network capable of matching or surpassing the predictive performance of deep networks trained on multivariate time series. In FocusLearn, a recurrent neural network learns the temporal dependencies in the data, while a multi-headed attention layer learns to weight selected features and suppress redundant ones. Modular neural networks are then trained in parallel and independently, one for each selected feature. This modular approach lets the user inspect how each feature influences the outcome in exactly the same way as with additive models. Experimental results show that this new approach outperforms additive models in both time-series regression and classification tasks, achieving predictive performance comparable to that of state-of-the-art, non-interpretable deep networks applied to time series.
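The additive, per-feature structure described in the abstract can be sketched as follows. This is a minimal illustration only: the fixed attention weights, module sizes, and parameter shapes are assumptions for demonstration, not the paper's actual implementation (which learns the weights via a recurrent network and a multi-headed attention layer). Each selected feature gets its own small network, and the prediction is the weighted sum of per-feature contributions, which is what makes the model inspectable in the same way as an additive model.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_net(x, w1, b1, w2, b2):
    # Small per-feature MLP: one scalar input column -> one scalar contribution.
    h = np.tanh(x * w1 + b1)      # (n, hidden)
    return h @ w2 + b2            # (n,)

n, d, hidden = 8, 3, 4            # samples, features, hidden units (illustrative)
X = rng.normal(size=(n, d))

# Hypothetical attention weights over features; in FocusLearn these would be
# produced by the multi-headed attention layer, with redundant features
# suppressed (weight near zero).
attn = np.array([0.6, 0.3, 0.1])

# One independent module per selected feature, as in additive models.
params = [(rng.normal(size=hidden), rng.normal(size=hidden),
           rng.normal(size=hidden), 0.0) for _ in range(d)]

# Per-feature contributions; column j shows exactly how feature j
# influences the outcome, which is the interpretability hook.
contribs = np.stack([attn[j] * feature_net(X[:, j:j + 1], *params[j])
                     for j in range(d)], axis=1)   # (n, d)

# Additive prediction: sum of the weighted per-feature contributions.
y_hat = contribs.sum(axis=1)                       # (n,)
```

Because the modules are independent, plotting `contribs[:, j]` against `X[:, j]` recovers the per-feature shape functions familiar from Neural Additive Models.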
| Field | Value |
|---|---|
| Publication Type | Conference or Workshop Item (Paper) |
| Additional Information | © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
| Subjects | H Social Sciences > HN Social history and conditions. Social problems. Social reform; Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Departments | School of Science & Technology; School of Science & Technology > Computer Science |