Fairbank, M. and Alonso, E. (2012). Efficient Calculation of the Gauss-Newton Approximation of the Hessian Matrix in Neural Networks. Neural Computation, 24(3), pp. 607–610. doi: 10.1162/NECO_a_00248
Abstract
The Levenberg-Marquardt (LM) learning algorithm is a popular algorithm for training neural networks; however, for large neural networks, it becomes prohibitively expensive in terms of running time and memory requirements. The most time-critical step of the algorithm is the calculation of the Gauss-Newton matrix, which is formed by multiplying two large Jacobian matrices together. We propose a method that uses backpropagation to reduce the time of this matrix-matrix multiplication. This reduces the overall asymptotic running time of the LM algorithm by a factor of the order of the number of output nodes in the neural network.
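To make the quantity in question concrete: the Gauss-Newton matrix is G = Σ_p J_p^T J_p, where J_p is the Jacobian of the network outputs with respect to the weights for training pattern p. The sketch below forms G the naive way, using a finite-difference Jacobian on a hypothetical two-layer network (the network sizes and function names are illustrative, not from the paper); the paper's contribution is to replace this kind of explicit Jacobian construction with a backpropagation-based scheme.

```python
import numpy as np

# Hypothetical tiny network: one hidden tanh layer, sizes chosen for
# illustration only (not from the paper).
N_IN, N_HID, N_OUT = 3, 4, 2
N_W = N_IN * N_HID + N_HID * N_OUT  # total number of weights

def forward(w, x):
    """Network output y(w, x) with weights packed into a flat vector w."""
    W1 = w[: N_IN * N_HID].reshape(N_HID, N_IN)
    W2 = w[N_IN * N_HID :].reshape(N_OUT, N_HID)
    return W2 @ np.tanh(W1 @ x)

def jacobian(w, x, eps=1e-6):
    """Finite-difference Jacobian dy/dw (n_out x n_weights).

    This explicit column-by-column construction is the expensive step
    that the paper's backpropagation method avoids.
    """
    y0 = forward(w, x)
    J = np.zeros((y0.size, w.size))
    for i in range(w.size):
        wp = w.copy()
        wp[i] += eps
        J[:, i] = (forward(wp, x) - y0) / eps
    return J

rng = np.random.default_rng(0)
w = rng.normal(size=N_W)
X = rng.normal(size=(5, N_IN))  # five training patterns

# Gauss-Newton matrix: G = sum over patterns of J_p^T J_p
G = sum(jacobian(w, x).T @ jacobian(w, x) for x in X)
```

G is symmetric positive semidefinite by construction, which is what makes it a usable Hessian approximation inside the LM update (G + λI) Δw = −∇E.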
Publication Type:  Article 

Additional Information:  The article has been published in Neural Computation. © 2014 The MIT Press 
Subjects:  Q Science > QA Mathematics > QA75 Electronic computers. Computer science 
Departments:  School of Mathematics, Computer Science & Engineering > Computer Science 
Date Deposited:  23 Oct 2014 14:04 
URI:  https://openaccess.city.ac.uk/id/eprint/4369 
