Backpropagation Algorithm

The experts below are selected from a list of 10,659 experts worldwide, ranked by the ideXlab platform.

P L Feintuch - One of the best experts on this subject based on the ideXlab platform.

  • Statistical analysis of the single-layer Backpropagation Algorithm, I: Mean weight behavior
    IEEE Transactions on Signal Processing, 1993
    Co-Authors: N J Bershad, J J Shynk, P L Feintuch
    Abstract:

    The single-layer Backpropagation Algorithm is a gradient-descent method that adjusts the connection weights of a single-layer perceptron to minimize the mean-square error at the output. It is similar to the standard least mean square algorithm, except the output of the linear combiner contains a differentiable nonlinearity. In this paper, we present a statistical analysis of the mean weight behavior of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model.
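
    A minimal sketch of one weight update, in Python, assuming a tanh nonlinearity; the names w, x, d, and mu are illustrative, and the paper does not prescribe this exact code:

        import numpy as np

        def single_layer_backprop_step(w, x, d, mu):
            """One update of the single-layer Backpropagation Algorithm.

            Identical in form to the LMS update, except that the linear
            combiner output passes through a differentiable nonlinearity
            g (tanh here), so the gradient picks up the factor g'(w^T x).
            """
            v = w @ x                       # linear combiner output
            e = d - np.tanh(v)              # error after the nonlinearity
            return w + mu * e * (1.0 - np.tanh(v) ** 2) * x  # gradient step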

  • Statistical analysis of the single-layer Backpropagation Algorithm
    Proceedings of ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing, 1991
    Co-Authors: N J Bershad, J J Shynk, P L Feintuch
    Abstract:

    The authors present a statistical analysis of the steady-state and transient properties of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the mean-square error decreases towards zero. These results indicate that the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model. However, the nature of the mean-square error and the corresponding performance surface are such that the perceptron is prevented from correctly classifying with probability one until the weights converge at infinity.
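
    The combination of unbounded weight growth and vanishing error can be illustrated with a small simulation; the reference hyperplane w_ref, step size, and sample count below are hypothetical choices, and the sign nonlinearity stands in for the paper's system identification model:

        import numpy as np

        rng = np.random.default_rng(0)
        w_ref = np.array([1.0, -1.0])   # hypothetical reference hyperplane
        w = np.zeros(2)
        mu = 0.05

        for _ in range(50000):
            x = rng.standard_normal(2)   # Gaussian input signal
            d = np.sign(w_ref @ x)       # desired response: +/-1 targets
            v = w @ x
            e = d - np.tanh(v)
            w += mu * e * (1.0 - np.tanh(v) ** 2) * x

        # ||w|| keeps growing, since tanh reaches the +/-1 targets only in
        # the limit, while the direction w/||w|| aligns with the correct
        # hyperplane normal and the squared error keeps shrinking.
        print(np.linalg.norm(w))
        print(w / np.linalg.norm(w), w_ref / np.linalg.norm(w_ref))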

N J Bershad - One of the best experts on this subject based on the ideXlab platform.

  • Statistical analysis of the single-layer Backpropagation Algorithm, I: Mean weight behavior
    IEEE Transactions on Signal Processing, 1993
    Co-Authors: N J Bershad, J J Shynk, P L Feintuch
    Abstract:

    The single-layer Backpropagation Algorithm is a gradient-descent method that adjusts the connection weights of a single-layer perceptron to minimize the mean-square error at the output. It is similar to the standard least mean square algorithm, except the output of the linear combiner contains a differentiable nonlinearity. In this paper, we present a statistical analysis of the mean weight behavior of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model.

  • Statistical analysis of the single-layer Backpropagation Algorithm
    Proceedings of ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing, 1991
    Co-Authors: N J Bershad, J J Shynk, P L Feintuch
    Abstract:

    The authors present a statistical analysis of the steady-state and transient properties of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the mean-square error decreases towards zero. These results indicate that the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model. However, the nature of the mean-square error and the corresponding performance surface are such that the perceptron is prevented from correctly classifying with probability one until the weights converge at infinity.

J J Shynk - One of the best experts on this subject based on the ideXlab platform.

  • Statistical analysis of the single-layer Backpropagation Algorithm, I: Mean weight behavior
    IEEE Transactions on Signal Processing, 1993
    Co-Authors: N J Bershad, J J Shynk, P L Feintuch
    Abstract:

    The single-layer Backpropagation Algorithm is a gradient-descent method that adjusts the connection weights of a single-layer perceptron to minimize the mean-square error at the output. It is similar to the standard least mean square algorithm, except the output of the linear combiner contains a differentiable nonlinearity. In this paper, we present a statistical analysis of the mean weight behavior of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model.

  • Statistical analysis of the single-layer Backpropagation Algorithm
    Proceedings of ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing, 1991
    Co-Authors: N J Bershad, J J Shynk, P L Feintuch
    Abstract:

    The authors present a statistical analysis of the steady-state and transient properties of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the mean-square error decreases towards zero. These results indicate that the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model. However, the nature of the mean-square error and the corresponding performance surface are such that the perceptron is prevented from correctly classifying with probability one until the weights converge at infinity.

Ge Wang - One of the best experts on this subject based on the ideXlab platform.

  • Generalized Backpropagation Algorithm for training second-order neural networks
    International Journal for Numerical Methods in Biomedical Engineering, 2018
    Co-Authors: Wenxiang Cong, Ge Wang
    Abstract:

    The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neurons could be upgraded to second-order counterparts, in which the linear operation between inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has strong nonlinear modeling ability, such as implementing basic fuzzy logic operations. In this paper, we develop a general Backpropagation Algorithm to train the network consisting of second-order neurons. Numerical studies are performed to verify the generalized Backpropagation Algorithm.
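
    A sketch of what a second-order neuron and its gradients might look like; the quadratic form x^T Q x + w^T x + b is an assumption for illustration, not necessarily the paper's exact parameterization:

        import numpy as np

        def sigmoid(v):
            return 1.0 / (1.0 + np.exp(-v))

        def second_order_neuron(x, Q, w, b):
            """Forward pass: the linear combiner of a conventional neuron
            is replaced with a quadratic operation on the inputs."""
            return sigmoid(x @ Q @ x + w @ x + b)

        def second_order_grads(x, delta):
            """Backpropagated gradients w.r.t. (Q, w, b), where delta is
            the error signal times the sigmoid derivative, exactly as in
            conventional Backpropagation."""
            return np.outer(x, x) * delta, x * delta, delta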

Masahiro Ishii - One of the best experts on this subject based on the ideXlab platform.

  • A modified error function for the Backpropagation Algorithm
    Neurocomputing, 2004
    Co-Authors: Xugang Wang, Zheng Tang, Hiroki Tamura, Masahiro Ishii
    Abstract:

    We have noted that the local minima problem in the Backpropagation Algorithm is usually caused by update disharmony between weights connected to the hidden layer and the output layer. To solve this problem, we propose a modified error function. It can harmonize the update of weights connected to the hidden layer and those connected to the output layer by adding one term to the conventional error function. It can thus avoid the local minima problem caused by such disharmony. Simulations on a benchmark problem and a real classification task have been performed to test the validity of the modified error function.
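
    The abstract does not state the added term, so the sketch below pairs the conventional squared error with a hypothetical anti-saturation penalty on the hidden activations h, purely to illustrate the shape of such a modification:

        import numpy as np

        def modified_error(y, d, h, lam=0.1):
            """Conventional error plus one added term (illustrative only).

            The extra term couples the hidden activations h into the
            objective so the hidden- and output-layer updates stay in step.
            -h*(1-h) is 0 when a sigmoidal h saturates at 0 or 1 and -0.25
            at h = 0.5, so minimizing the total pushes hidden units away
            from saturation. The penalty and the weight lam are hypothetical.
            """
            mse = 0.5 * np.sum((d - y) ** 2)
            return mse - lam * np.sum(h * (1.0 - h))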

  • An improved Backpropagation Algorithm to avoid the local minima problem
    Neurocomputing, 2004
    Co-Authors: Xugang Wang, Zheng Tang, Hiroki Tamura, Masahiro Ishii
    Abstract:

    We propose an improved Backpropagation Algorithm intended to avoid the local minima problem caused by neuron saturation in the hidden layer. Each training pattern has its own activation functions for the neurons in the hidden layer. When the network outputs have not yet reached their desired signals, the activation functions are adapted so as to prevent neurons in the hidden layer from saturating. Simulations on some benchmark problems have been performed to demonstrate the validity of the proposed method.
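
    One way such per-pattern adaptation could look; the multiplicative slope rule and the error threshold below are hypothetical, since the abstract only states that the activation functions are adapted to keep hidden neurons out of saturation:

        import numpy as np

        def adaptive_sigmoid(v, a):
            """Sigmoid whose slope a is adapted per training pattern."""
            return 1.0 / (1.0 + np.exp(-a * v))

        def adapt_slope(a, output_error, eta=0.05, a_min=0.1):
            """Flatten the hidden-layer sigmoid while the error is large.

            For a strongly driven hidden unit, a smaller slope a keeps the
            activation off its asymptotes, so the derivative term s*(1 - s)
            in the backpropagated error signal stays away from zero.
            """
            if abs(output_error) > 0.1:           # desired signal not reached
                a = max(a_min, a * (1.0 - eta))   # reduce the slope slightly
            return a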