Parallel and Separable Recursive Levenberg-Marquardt Training Algorithm


Bibliographic Details
Main Authors: Asirvadam, Vijanth Sagayan, McLoone, Sean, Irwin, George
Other Authors: Bourlard, Hervé
Format: Book Section
Published: IEEE Press 2002
Online Access:http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1030011
http://eprints.utp.edu.my/3827/
Description
Summary: A novel decomposed recursive Levenberg-Marquardt (RLM) algorithm is derived for the training of feedforward neural networks. By neglecting interneuron weight correlations, the recently proposed RLM training algorithm can be decomposed at the neuron level, enabling weights to be updated in an efficient parallel manner. A separable least squares implementation of the decomposed RLM is also introduced. Experimental results for two nonlinear time series problems demonstrate the superiority of the new training algorithms.
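
To illustrate the neuron-level decomposition the summary describes, here is a minimal Python sketch: each neuron keeps only its own small covariance matrix, so its recursive update is independent of every other neuron and all neurons can be processed in parallel. This is an assumed recursive Gauss-Newton/RLS-style update with an LM-style damping term folded in; the function name neuron_rlm_step, the forgetting factor lam, the damping mu, and the exact update equations are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def neuron_rlm_step(w, P, psi, err, lam=0.99, mu=1e-3):
    # One recursive update for a single neuron's weights, assuming
    # interneuron correlations are neglected so this neuron maintains
    # only its own local covariance P (hypothetical simplification;
    # lam and mu are illustrative defaults, not values from the paper).
    #   w   : (n,)  weight vector of this neuron
    #   P   : (n,n) local inverse-Hessian / covariance estimate
    #   psi : (n,)  gradient of the network output w.r.t. these weights
    #   err : scalar output error for the current training sample
    denom = lam + psi @ P @ psi + mu      # damped innovation term (LM-style)
    k = (P @ psi) / denom                 # local gain vector
    w = w + k * err                       # recursive weight update
    P = (P - np.outer(k, psi @ P)) / lam  # covariance update with forgetting
    return w, P

# Because each update touches only one neuron's (w, P), the loop below is
# embarrassingly parallel across neurons, which is the point of the
# decomposition described in the summary.
rng = np.random.default_rng(0)
neurons = [(rng.standard_normal(4), np.eye(4) * 100.0) for _ in range(3)]
psi_list = [rng.standard_normal(4) for _ in range(3)]  # placeholder gradients
err = 0.5                                              # placeholder error
neurons = [neuron_rlm_step(w, P, psi, err)
           for (w, P), psi in zip(neurons, psi_list)]
```

The per-neuron covariance matrices are only n-by-n rather than one covariance over all weights, which is where both the computational saving and the parallelism come from in the decomposed scheme sketched here.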