Parallel and Separable Recursive Levenberg-Marquardt Training Algorithm
Main Authors:
Other Authors:
Format: Book Section
Published: IEEE Press, 2002
Subjects:
Online Access: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1030011
http://eprints.utp.edu.my/3827/
Summary: A novel decomposed recursive Levenberg-Marquardt (RLM) algorithm is derived for the training of feedforward neural networks. By neglecting interneuron weight correlations, the recently proposed RLM training algorithm can be decomposed at the neuron level, enabling weights to be updated in an efficient parallel manner. A separable least-squares implementation of the decomposed RLM is also introduced. Experimental results for two nonlinear time-series problems demonstrate the superiority of the new training algorithms.
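To make the decomposition concrete, below is a minimal sketch of what a neuron-level decomposed recursive update can look like. Since the paper itself is only summarized here, this is an illustration under stated assumptions, not the authors' exact algorithm: neglecting interneuron weight correlations is taken to mean keeping the Gauss-Newton information matrix block-diagonal, with one small covariance block per hidden neuron that is updated independently (and hence in parallel). The network structure, hyperparameter names (mu, lam), and the toy time series are all hypothetical; the separable least-squares variant (solving the linear output weights exactly) is not shown.

```python
import numpy as np

# Sketch: neuron-decomposed recursive Levenberg-Marquardt-style update
# for a one-hidden-layer tanh network. Each hidden neuron j owns a
# parameter block theta[j] = [input weights, bias, output weight] and
# its own covariance block P[j]; blocks never interact, which is the
# "neglect interneuron correlations" assumption from the abstract.

rng = np.random.default_rng(0)

n_in, n_hidden = 3, 5
theta = [0.1 * rng.standard_normal(n_in + 2) for _ in range(n_hidden)]
mu = 1e-2   # LM-style damping: P[j] starts at I/mu (assumption)
lam = 0.99  # forgetting factor for the recursive update (assumption)
P = [np.eye(n_in + 2) / mu for _ in range(n_hidden)]

def predict_and_grads(x):
    """Return the prediction and the per-neuron Jacobian blocks g_j."""
    y_hat, grads = 0.0, []
    for th in theta:
        w, b, v = th[:n_in], th[n_in], th[n_in + 1]
        h = np.tanh(w @ x + b)
        y_hat += v * h
        dh = v * (1.0 - h * h)  # chain rule through tanh
        grads.append(np.concatenate([dh * x, [dh, h]]))
    return y_hat, grads

# Toy nonlinear time series: predict s[t] from the previous 3 samples.
s = np.sin(0.3 * np.arange(503)) + 0.05 * rng.standard_normal(503)
for t in range(n_in, len(s)):
    x, y = s[t - n_in:t], s[t]
    y_hat, grads = predict_and_grads(x)
    e = y - y_hat
    # Each block update touches only neuron j's data, so this loop is
    # embarrassingly parallel -- the point of the decomposition.
    for j, g in enumerate(grads):
        Pg = P[j] @ g
        k = Pg / (lam + g @ Pg)          # per-neuron gain vector
        theta[j] = theta[j] + k * e
        P[j] = (P[j] - np.outer(k, Pg)) / lam

print("final one-step-ahead error:", abs(e))
```

The payoff of the block-diagonal approximation is cost: a full RLM covariance over all n parameters costs O(n^2) per sample, whereas the decomposed form costs only the sum of the squared block sizes, and the per-neuron updates can be distributed across processors.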