Feed-Forward Neural Networks Need Inductive Bias to Learn Equality Relations
Weyde, T. ORCID: 0000-0001-8028-9905 & Kopparti, R. M. (2018). Feed-Forward Neural Networks Need Inductive Bias to Learn Equality Relations. Paper presented at the 32nd Conference on Neural Information Processing Systems (NIPS 2018), 2-8 Dec 2018, Montreal, Canada.
Abstract
Basic binary relations such as equality and inequality are fundamental to relational data structures. Neural networks should learn such relations and generalise to new, unseen data. We show in this study, however, that this generalisation fails with standard feed-forward networks on binary vectors. Even when trained on maximal training data, standard networks do not reliably detect equality.
We introduce differential rectifier (DR) units that we add to the network in different configurations. The DR units create an inductive bias in the networks so that they do learn to generalise, even from small numbers of examples, and we have found no negative effect of their inclusion in the network. Given the fundamental nature of these relations, we hypothesise that feed-forward neural network learning benefits from inductive bias for other relations as well. Consequently, the further development of suitable inductive biases will be beneficial to many tasks in relational learning with neural networks.
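The abstract does not spell out how a DR unit is computed; a plausible reading of "differential rectifier" is an element-wise rectified (absolute) difference of paired inputs, which is zero exactly when the inputs are equal. The sketch below is a minimal illustration under that assumption, written in PyTorch; the names `DRUnit` and `EqualityNet`, the network configuration, and all hyperparameters are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn


class DRUnit(nn.Module):
    """Hypothetical differential rectifier: element-wise |x1 - x2| of paired inputs."""

    def forward(self, x1, x2):
        # Rectified difference: all zeros exactly when the two vectors are equal.
        return torch.abs(x1 - x2)


class EqualityNet(nn.Module):
    """Feed-forward classifier fed by the DR-unit output (assumed configuration)."""

    def __init__(self, dim, hidden=16):
        super().__init__()
        self.dr = DRUnit()
        self.mlp = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x1, x2):
        # Estimated probability that x1 and x2 are equal element-wise.
        return torch.sigmoid(self.mlp(self.dr(x1, x2)))


# Illustrative usage on random binary vectors:
x = torch.randint(0, 2, (4, 8)).float()
y = x.clone()
model = EqualityNet(dim=8)
print(model(x, y).shape)  # torch.Size([4, 1])
```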
| Publication Type: | Conference or Workshop Item (Paper) |
|---|---|
| Additional Information: | Relational Representation Learning Workshop, NeurIPS 2018 |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; R Medicine > RC Internal medicine > RC0321 Neuroscience. Biological psychiatry. Neuropsychiatry |
| Departments: | School of Science & Technology > Computer Science |