City Research Online

ReLEx: Regularisation for Linear Extrapolation in Neural Networks with Rectified Linear Units

Lopedoto, E. and Weyde, T. ORCID: 0000-0001-8028-9905 (2020). ReLEx: Regularisation for Linear Extrapolation in Neural Networks with Rectified Linear Units. Paper presented at the AI-2020 Fortieth SGAI International Conference on Artificial Intelligence, 8-9 Dec 2020; 15-17 Dec 2020, Virtual.

Abstract

Despite the great success of neural networks in recent years, they do not provide useful extrapolation. In regression tasks, the popular Rectified Linear Units do enable unbounded linear extrapolation by neural networks, but their extrapolation behaviour varies widely and is largely independent of the training data. Our goal is instead to continue the local linear trend at the margin of the training data. Here we introduce ReLEx, a regularising method composed of a set of loss terms designed to achieve this goal and reduce the variance of the extrapolation. We present a ReLEx implementation for single-input, single-output, single-hidden-layer feed-forward networks. Our results demonstrate that ReLEx has little cost in terms of standard learning, i.e. interpolation, but enables controlled univariate linear extrapolation with ReLU neural networks.
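The abstract's observation that ReLU networks always extrapolate linearly, with a slope that is largely an artefact of the learned weights, can be illustrated with a minimal sketch. The snippet below builds a hypothetical single-input, single-output, single-hidden-layer ReLU network with random weights (an assumed toy setup, not the paper's actual implementation) and checks that, beyond the last ReLU breakpoint, the output is exactly linear:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network: f(x) = w2 . relu(w1 * x + b1) + b2
# (random weights stand in for a trained model)
w1 = rng.normal(size=8)   # hidden-layer weights
b1 = rng.normal(size=8)   # hidden-layer biases
w2 = rng.normal(size=8)   # output weights
b2 = 0.5                  # output bias

def f(x):
    return w2 @ np.maximum(w1 * x + b1, 0.0) + b2

# Each unit's activation flips sign at its "kink" x = -b1/w1.
# Beyond the right-most kink no unit switches on or off, so the
# active set is fixed and f is an affine function of x there.
kinks = -b1 / w1
x0 = kinks.max() + 1.0
slope_near = f(x0 + 1.0) - f(x0)        # slope just past the data
slope_far = f(x0 + 10.0) - f(x0 + 9.0)  # slope much further out
print(np.isclose(slope_near, slope_far))  # constant extrapolation slope
```

The slope of that affine tail is the sum of `w2[i] * w1[i]` over the units that remain active, which is why it need not match the trend of the training data at the margin; ReLEx's loss terms target exactly this mismatch.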

Publication Type: Conference or Workshop Item (Paper)
Additional Information: The final authenticated version will be available online at https://www.springer.com/gb/computer-science/lncs
Publisher Keywords: Neural Networks, Regression, Regularisation, Extrapolation
Subjects: R Medicine > RC Internal medicine > RC0321 Neuroscience. Biological psychiatry. Neuropsychiatry
Departments: School of Mathematics, Computer Science & Engineering > Computer Science
Date Deposited: 24 Sep 2020 14:20
URI: https://openaccess.city.ac.uk/id/eprint/24941
Text - Accepted Version (239kB)
