Levenberg-Marquardt

The experts below are selected from a list of 17,529 experts worldwide, ranked by the ideXlab platform.

G. Lera - One of the best experts on this subject based on the ideXlab platform.

  • Improvement of the neighborhood based Levenberg-Marquardt algorithm by local adaptation of the learning coefficient
    2005
    Co-Authors: A. Toledo, Miguel Pinzolas, J.j. Ibarrola, G. Lera
    Abstract:

    In this letter, an improvement of the recently developed neighborhood-based Levenberg-Marquardt (NBLM) algorithm is proposed and tested for neural network (NN) training. The algorithm is modified by allowing local adaptation of a different learning coefficient for each neighborhood. This simple addition to the NBLM training method significantly increases the efficiency of training episodes carried out with small neighborhood sizes, thus allowing substantial savings in memory occupation and computational time while obtaining better performance than the original Levenberg-Marquardt (LM) and NBLM methods.

  • Neighborhood based Levenberg-Marquardt algorithm for neural network training
    2002
    Co-Authors: G. Lera, Miguel Pinzolas
    Abstract:

    Although the Levenberg-Marquardt (LM) algorithm has been extensively applied as a neural-network training method, it is very expensive, in both memory and number of operations required, when the network to be trained has a significant number of adaptive weights. In this paper, the behavior of a recently proposed variation of this algorithm is studied. The new method applies the concept of neural neighborhoods to the LM algorithm. It is shown that, by performing an LM step on a single neighborhood at each training iteration, not only are significant savings in memory occupation and computing effort obtained, but the overall performance of the LM method can also be increased.
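
The two abstracts above describe the mechanics only in words: at each iteration an LM step is applied to the weights of a single neighborhood rather than to the whole network, and (per the 2005 letter) each neighborhood keeps its own learning coefficient. The following is a minimal sketch of that idea, not the authors' implementation: the toy network, the way neighborhoods are formed (contiguous blocks of the weight vector), the finite-difference Jacobian, and the halve/double coefficient schedule are all assumptions made for illustration.

```python
import numpy as np

# Tiny one-hidden-layer network y = w2 . tanh(W1 x + b1) + b2, weights kept in one flat vector.
rng = np.random.default_rng(0)
n_in, n_hid = 1, 5
sizes = [n_hid * n_in, n_hid, n_hid, 1]            # W1, b1, w2, b2

def unpack(w):
    parts, i = [], 0
    for s in sizes:
        parts.append(w[i:i + s])
        i += s
    return parts[0].reshape(n_hid, n_in), parts[1], parts[2], parts[3]

def predict(w, X):
    W1, b1, w2, b2 = unpack(w)
    return np.tanh(X @ W1.T + b1) @ w2 + b2

def residuals(w, X, y):
    return predict(w, X) - y

def jacobian(w, X, y, eps=1e-6):
    # Finite-difference Jacobian of the residual vector with respect to the weights.
    r0 = residuals(w, X, y)
    J = np.empty((r0.size, w.size))
    for j in range(w.size):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (residuals(wp, X, y) - r0) / eps
    return J

# Toy regression data: fit y = sin(3x).
X = np.linspace(-1.0, 1.0, 40).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0])
w = rng.normal(scale=0.5, size=sum(sizes))

# Neighborhoods: here simply contiguous blocks of the weight vector (an illustrative choice).
neighborhoods = np.array_split(np.arange(w.size), 4)
mu = np.full(len(neighborhoods), 1e-2)             # one damping/learning coefficient per neighborhood

for it in range(400):
    k = it % len(neighborhoods)                    # visit the neighborhoods cyclically
    idx = neighborhoods[k]
    r = residuals(w, X, y)
    J = jacobian(w, X, y)[:, idx]                  # Jacobian restricted to this neighborhood
    # LM step over the neighborhood only: solve (J^T J + mu_k I) d = -J^T r
    d = np.linalg.solve(J.T @ J + mu[k] * np.eye(idx.size), -J.T @ r)
    w_trial = w.copy()
    w_trial[idx] += d
    if np.sum(residuals(w_trial, X, y) ** 2) < np.sum(r ** 2):
        w = w_trial
        mu[k] *= 0.5                               # step accepted: relax the local coefficient
    else:
        mu[k] *= 2.0                               # step rejected: increase the local damping

print("final sum of squared errors:", np.sum(residuals(w, X, y) ** 2))
```

Restricting the solve to one neighborhood keeps the linear system at the neighborhood size instead of the full weight count, which is where the memory and computation savings described above come from; the per-neighborhood mu plays the role of the locally adapted learning coefficient of the 2005 letter.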

Miguel Pinzolas - One of the best experts on this subject based on the ideXlab platform.

  • Improvement of the neighborhood based Levenberg-Marquardt algorithm by local adaptation of the learning coefficient
    2005
    Co-Authors: A. Toledo, Miguel Pinzolas, J.j. Ibarrola, G. Lera
    Abstract: identical to the abstract listed under G. Lera above.

  • Neighborhood based Levenberg-Marquardt algorithm for neural network training
    2002
    Co-Authors: G. Lera, Miguel Pinzolas
    Abstract: identical to the abstract listed under G. Lera above.

Inan Guler - One of the best experts on this subject based on the ideXlab platform.

  • Multilayer perceptron neural networks to compute quasistatic parameters of asymmetric coplanar waveguides
    2004
    Co-Authors: Elif Derya Ubeyli, Inan Guler
    Abstract:

    Artificial neural networks (ANNs) have recently gained attention as fast and flexible vehicles for microwave modeling, simulation, and optimization. In this study, ANNs based on the multilayer perceptron were presented for accurate computation of the quasistatic parameters of asymmetric coplanar waveguides (ACPWs). Multilayer perceptron neural networks (MLPNNs) were trained with the backpropagation, delta-bar-delta, extended delta-bar-delta, quick-propagation, and Levenberg-Marquardt algorithms to compute the quasistatic parameters of the ACPWs: the characteristic impedance and the effective dielectric constant. The results of the MLPNNs trained with the Levenberg-Marquardt algorithm were in very good agreement with results available in the literature obtained using the conformal-mapping technique.

  • Detection of electrocardiographic changes in partial epileptic patients using Lyapunov exponents with multilayer perceptron neural networks
    2004
    Co-Authors: Elif Derya Ubeyli, Inan Guler
    Abstract:

    In this study, a new approach based on the consideration that electrocardiogram (ECG) signals are chaotic was presented for the detection of electrocardiographic changes in patients with partial epilepsy. This consideration was tested successfully using nonlinear dynamics tools such as the computation of Lyapunov exponents. Multilayer perceptron neural network (MLPNN) architectures were formulated and used as the basis for detecting electrocardiographic changes in patients with partial epilepsy. Two types of ECG beats (normal and partial epilepsy) were obtained from the MIT-BIH database. The computed Lyapunov exponents of the ECG signals were used as inputs to the MLPNNs trained with the backpropagation, delta-bar-delta, extended delta-bar-delta, quick-propagation, and Levenberg-Marquardt algorithms. The performance of the MLPNN classifiers was evaluated in terms of training performance and classification accuracy. Receiver operating characteristic (ROC) curves were used to assess the performance of the detection process. The results confirmed that the proposed MLPNN trained with the Levenberg-Marquardt algorithm has potential for detecting electrocardiographic changes in patients with partial epilepsy.
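
The first abstract above amounts to a supervised regression problem: waveguide descriptors go in, and the two quasistatic parameters (characteristic impedance and effective dielectric constant) come out, with the MLP weights fitted by the Levenberg-Marquardt algorithm. The study's network topology, input parameterization, and training data are not given here, so the sketch below is only a schematic stand-in: it uses SciPy's general-purpose LM solver (scipy.optimize.least_squares with method='lm') on a generic two-output MLP with synthetic placeholder data.

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder training set: each row would hold ACPW descriptors (slot widths, strip
# width, substrate permittivity, ...) and the targets would be (Z0, eps_eff) obtained,
# e.g., by conformal mapping. The values below are synthetic stand-ins.
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 4))
Y = np.column_stack([50.0 + 40.0 * X[:, 0],        # fake characteristic impedance
                     2.0 + 8.0 * X[:, 3]])         # fake effective dielectric constant

n_in, n_hid, n_out = X.shape[1], 8, Y.shape[1]
shapes = [(n_hid, n_in), (n_hid,), (n_out, n_hid), (n_out,)]
n_w = sum(int(np.prod(s)) for s in shapes)

def unpack(w):
    parts, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1.T + b1) @ W2.T + b2

def residuals(w):
    return (forward(w, X) - Y).ravel()             # least_squares expects a flat residual vector

w0 = rng.normal(scale=0.3, size=n_w)
fit = least_squares(residuals, w0, method='lm')    # MINPACK's Levenberg-Marquardt
print("final sum of squared errors:", 2.0 * fit.cost)
```

With real data, X would hold the ACPW geometry and substrate parameters, and Y the conformal-mapping results the abstract compares against.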
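
The second abstract describes a detection pipeline: Lyapunov exponents estimated from MIT-BIH ECG beats are fed to an MLPNN trained with the Levenberg-Marquardt algorithm, and the classifiers are then compared with ROC analysis. Neither the exponent estimation nor the network details can be reproduced from the abstract, so the sketch below only illustrates the ROC bookkeeping on synthetic labels and classifier scores; every numeric value in it is a placeholder.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Synthetic stand-ins: y_true marks each beat as normal (0) or partial epilepsy (1),
# and `scores` imitates the output of a trained classifier (e.g., an MLPNN whose
# inputs are the beat's estimated Lyapunov exponents).
rng = np.random.default_rng(2)
y_true = rng.integers(0, 2, size=300)
scores = np.clip(0.6 * y_true + rng.normal(0.2, 0.25, size=300), 0.0, 1.0)

fpr, tpr, thresholds = roc_curve(y_true, scores)
print("area under the ROC curve:", roc_auc_score(y_true, scores))

# One common way to pick an operating point: the threshold closest to the (0, 1) corner.
best = np.argmin(np.hypot(fpr, 1.0 - tpr))
print("threshold:", thresholds[best],
      "sensitivity:", tpr[best],
      "1 - specificity:", fpr[best])
```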

Jemal H Abawajy - One of the best experts on this subject based on the ideXlab platform.

  • An accelerated particle swarm optimization based Levenberg-Marquardt back propagation algorithm
    2014
    Co-Authors: Nazri Mohd Nawi, Abdullah Khan, M Z Rehman, Maslina Abdul Aziz, Tutut Herawan, Jemal H Abawajy
    Abstract:

    The Levenberg-Marquardt (LM) algorithm is one of the most effective algorithms for speeding up the convergence rate of artificial neural networks (ANNs) with multilayer perceptron (MLP) architectures. However, the LM algorithm suffers from entrapment in local minima. Therefore, we introduce several improvements to the LM algorithm by training the ANN with a nature-inspired meta-heuristic algorithm. This paper proposes a hybrid technique, Accelerated Particle Swarm Optimization combined with Levenberg-Marquardt (APSO_LM), to achieve a faster convergence rate and to avoid the local minima problem. These techniques are chosen because they provide faster training for solving pattern recognition problems with numerical optimization. The performance of the proposed algorithm is evaluated on several benchmark classification datasets. The results are compared with the Artificial Bee Colony (ABC) algorithm using a back-propagation neural network (BPNN) and with other hybrid variants. Based on the experimental results, the proposed APSO_LM algorithm demonstrated better performance than the other existing algorithms in terms of convergence speed and mean squared error (MSE).
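
The abstract describes the hybrid only at the level of its two stages: an accelerated particle swarm explores the weight space globally, and Levenberg-Marquardt then refines the best candidate to speed up convergence and escape poor local minima. The sketch below follows that division of labor but is not the authors' APSO_LM: the APSO update form (a global-best-only drift with a shrinking random kick), the swarm size, the toy classification objective, and the hand-off after a fixed number of swarm iterations are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)

# Small synthetic two-class problem and a tiny MLP whose flat weight vector we optimize.
X = rng.normal(size=(120, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)          # XOR-like labels
n_hid = 6
n_w = 2 * n_hid + n_hid + n_hid + 1                # W1, b1, w2, b2

def mlp_out(w, X):
    W1 = w[:2 * n_hid].reshape(n_hid, 2)
    b1 = w[2 * n_hid:3 * n_hid]
    w2 = w[3 * n_hid:4 * n_hid]
    b2 = w[-1]
    return 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1.T + b1) @ w2 + b2)))

def residuals(w):
    return mlp_out(w, X) - y

def sse(w):
    return float(np.sum(residuals(w) ** 2))

# --- Stage 1: accelerated PSO (global-best-only variant) for coarse global search ---
n_particles, alpha, beta = 20, 0.3, 0.5
swarm = rng.normal(size=(n_particles, n_w))
g_best = min(swarm, key=sse).copy()
for t in range(50):
    # Each particle drifts toward the global best plus a shrinking random kick.
    kick = alpha * (0.95 ** t) * rng.normal(size=swarm.shape)
    swarm = (1.0 - beta) * swarm + beta * g_best + kick
    cand = min(swarm, key=sse)
    if sse(cand) < sse(g_best):
        g_best = cand.copy()

# --- Stage 2: Levenberg-Marquardt refinement starting from the APSO solution ---
fit = least_squares(residuals, g_best, method='lm')
print("SSE after APSO:     ", sse(g_best))
print("SSE after APSO + LM:", 2.0 * fit.cost)
```

In the paper the second stage is LM training of the back-propagation network itself; here SciPy's generic LM least-squares solver stands in for it.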

Jose Jesus De Rubio - One of the best experts on this subject based on the ideXlab platform.

  • Stability Analysis of the Modified Levenberg-Marquardt Algorithm for the Artificial Neural Network Training.
    2020
    Co-Authors: Jose Jesus De Rubio
    Abstract:

    The Levenberg-Marquardt and Newton algorithms both use the Hessian for artificial neural network learning. In this article, we propose a modified Levenberg-Marquardt algorithm for artificial neural network learning covering both the training and testing stages. The modified Levenberg-Marquardt algorithm is based on the Levenberg-Marquardt and Newton algorithms but with the following two differences, which assure the error stability and weights boundedness: 1) there is a singularity point in the learning rates of the Levenberg-Marquardt and Newton algorithms, while there is no singularity point in the learning rate of the modified Levenberg-Marquardt algorithm; and 2) the Levenberg-Marquardt and Newton algorithms have three different learning rates, while the modified Levenberg-Marquardt algorithm has only one learning rate. The error stability and weights boundedness of the modified Levenberg-Marquardt algorithm are assured based on the Lyapunov technique. We compare artificial neural network learning with the modified Levenberg-Marquardt, Levenberg-Marquardt, Newton, and stable gradient algorithms on the electric and brain signals data sets.
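
The abstract characterizes the modification (a single, non-singular learning rate with Lyapunov-based guarantees of error stability and bounded weights) without giving the update rule, so de Rubio's algorithm itself is not reconstructed here. For orientation, below is a minimal sketch of the standard damped Levenberg-Marquardt weight increment that both the original and the modified algorithms build on; the shapes and the damping value are arbitrary.

```python
import numpy as np

def lm_step(J, e, mu):
    """One textbook Levenberg-Marquardt weight increment.

    J  -- Jacobian of the error vector w.r.t. the weights, shape (n_samples, n_weights)
    e  -- error vector, shape (n_samples,)
    mu -- damping term: large mu gives a small, gradient-like step,
          small mu gives (approximately) the Gauss-Newton step
    """
    H_gn = J.T @ J                                 # Gauss-Newton approximation of the Hessian
    g = J.T @ e                                    # gradient of 0.5 * ||e||^2
    return -np.linalg.solve(H_gn + mu * np.eye(J.shape[1]), g)

# Exercise the function with arbitrary shapes.
rng = np.random.default_rng(4)
J = rng.normal(size=(30, 7))
e = rng.normal(size=30)
print(lm_step(J, e, mu=1e-2))
```

The damping term keeps this linear system nonsingular even when J^T J is rank-deficient; the singularity the abstract refers to is a different one, in the learning-rate expressions of the compared algorithms, which the modified version is reported to remove.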