Inductive learning in Shared Neural Multi-Spaces
Howe, J. M., Mota, E. D. & Garcez, A. (2017). Inductive learning in Shared Neural Multi-Spaces. CEUR Workshop Proceedings, 2003.
Abstract
The learning of rules from examples is of continuing interest to machine learning since it allows generalization from fewer training examples. Inductive Logic Programming (ILP) generates hypothetical rules (clauses) from a knowledge base augmented with (positive and negative) examples. A successful hypothesis entails all positive examples and does not entail any negative example. The Shared Neural Multi-Space (Shared NeMuS) structure encodes first order expressions in a graph suitable for ILP-style learning. This paper explores the NeMuS structure and its relationship with the Herbrand Base of a knowledge base to generate hypotheses inductively. It is demonstrated that inductive learning driven by the knowledge base structure can be implemented successfully in the Amao cognitive agent framework, including the learning of recursive hypotheses.
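The acceptance criterion stated above can be illustrated concretely. The following Python toy is a minimal sketch, not taken from the paper or from Amao (the parent/grandparent predicates and the candidate clause are illustrative assumptions): a candidate clause is accepted only if, together with the background facts, it derives every positive example and no negative example.

```python
# Background knowledge: ground facts as (predicate, arg1, arg2) tuples.
background = {("parent", "ann", "bob"), ("parent", "bob", "carol")}
positives = {("grandparent", "ann", "carol")}   # examples the hypothesis must entail
negatives = {("grandparent", "bob", "ann")}     # examples it must not entail


def grandparent_rule(facts):
    """Candidate hypothesis: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    Returns the facts derived by one application of the clause."""
    derived = set()
    for (p1, x, y1) in facts:
        for (p2, y2, z) in facts:
            if p1 == p2 == "parent" and y1 == y2:
                derived.add(("grandparent", x, z))
    return derived


def accepted(hypothesis, facts, pos, neg):
    """ILP acceptance test: cover all positives, cover no negatives."""
    derived = hypothesis(facts)
    return pos <= derived and not (neg & derived)


print(accepted(grandparent_rule, background, positives, negatives))  # True
```

For recursive hypotheses, as discussed in the paper, the derivation step would need to be iterated to a fixpoint rather than applied once; the sketch above only covers the non-recursive case.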
| Publication Type: | Article |
| --- | --- |
| Additional Information: | Copyright © 2017 for the individual papers by the papers' authors. Copying permitted for private and academic purposes. This volume is published and copyrighted by its editors. |
| Departments: | School of Science & Technology > Computer Science |