Backpropagation Algorithm
The experts below were selected from a list of 10,659 experts worldwide ranked by the ideXlab platform.
P L Feintuch – 1st expert on this subject based on the ideXlab platform

Statistical analysis of the single-layer Backpropagation Algorithm. I. Mean weight behavior
IEEE Transactions on Signal Processing, 1993
Co-Authors: N J Bershad, J J Shynk, P L Feintuch
Abstract: The single-layer Backpropagation Algorithm is a gradient-descent method that adjusts the connection weights of a single-layer perceptron to minimize the mean-square error at the output. It is similar to the standard least-mean-square algorithm, except that the output of the linear combiner contains a differentiable nonlinearity. In this paper, we present a statistical analysis of the mean weight behavior of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model.
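The update rule the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' code: a single-layer perceptron trained like LMS, but with the error scaled by the derivative of the differentiable nonlinearity (a logistic sigmoid is assumed here); the step size `mu` is a hypothetical choice.

```python
import numpy as np

def sigmoid(u):
    """Differentiable nonlinearity at the output of the linear combiner."""
    return 1.0 / (1.0 + np.exp(-u))

def single_layer_backprop_step(w, x, d, mu=0.1):
    """One weight update of the single-layer backpropagation algorithm.

    Like the LMS update mu * e * x, except the error e is further
    scaled by the sigmoid's derivative y * (1 - y), the chain-rule
    factor from the output nonlinearity.
    """
    y = sigmoid(w @ x)                 # nonlinear perceptron output
    e = d - y                          # instantaneous output error
    return w + mu * e * y * (1.0 - y) * x   # gradient-descent step on e^2
```

For a target of 1, a single step moves the output toward 1, mirroring the gradient-descent behavior the abstract analyzes in the mean.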

Statistical analysis of the single-layer Backpropagation Algorithm
[Proceedings] ICASSP 91: 1991 International Conference on Acoustics, Speech, and Signal Processing, 1991
Co-Authors: N J Bershad, J J Shynk, P L Feintuch
Abstract: The authors present a statistical analysis of the steady-state and transient properties of the single-layer Backpropagation Algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the mean-square error decreases towards zero. These results indicate that the Algorithm, on average, quickly learns the correct hyperplane associated with the system identification model. However, the nature of the mean-square error and the corresponding performance surface are such that the perceptron is prevented from correctly classifying with probability one until the weights converge at infinity.
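The behavior described above (weights growing without bound while the mean-square error keeps falling) can be illustrated with a small simulation. This is a sketch under assumed settings, not the paper's exact experiment: Gaussian inputs labeled 0/1 by a hyperplane, a logistic sigmoid, and an arbitrary step size; the sigmoid only reaches the 0/1 targets in the limit of infinite weights, which is why the weights drift outward.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

n, mu = 4000, 0.5
X = rng.standard_normal((n, 2))                    # Gaussian input signals
d = (X @ np.array([1.0, 1.0]) > 0).astype(float)   # desired response from a hyperplane

w = np.zeros(2)
sq_err = np.empty(n)
for k in range(n):
    y = sigmoid(w @ X[k])
    e = d[k] - y
    sq_err[k] = e * e
    w = w + mu * e * y * (1.0 - y) * X[k]          # single-layer backprop update

early_mse = sq_err[: n // 4].mean()                # average error early in training
late_mse = sq_err[-(n // 4):].mean()               # average error late in training
```

Running this, the weight norm keeps growing while the late-training mean-square error falls well below the early one, consistent with the paper's conclusion that correct classification with probability one is only approached as the weights diverge.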
N J Bershad – 2nd expert on this subject based on the ideXlab platform
(Publications listed above.)
J J Shynk – 3rd expert on this subject based on the ideXlab platform
(Publications listed above.)