Iterative Construction

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 20,400 Experts worldwide ranked by the ideXlab platform

Mark E. Oxley - One of the best experts on this subject based on the ideXlab platform.

  • MLP Iterative Construction algorithm
    Neurocomputing, 1997
    Co-Authors: Thomas F. Rathbun, Steven K. Rogers, Martin P. Desimio, Mark E. Oxley
    Abstract:

    This paper presents a novel multi-layer perceptron (MLP) neural network architecture selection and weight training algorithm for classification problems. The MLP Iterative Construction Algorithm (MICA) autonomously constructs an MLP neural network as it trains. Experimental results show that the algorithm achieves 100% accuracy on the training data and generalization accuracies on the test data as good as or better than Backprop's, while using fewer FLOPs. Moreover, relaxation of the hidden layer nodes improves test-set recognition accuracies beyond Backprop's. Furthermore, seeding the Backprop algorithm with the hidden layer weights from MICA is demonstrated. The MICA seeding improves the effectiveness of Backprop and enables Backprop to solve a new class of problems, i.e., problems with areas of low mean-squared error.

  • MLP Iterative Construction algorithm
    Applications and Science of Artificial Neural Networks III, 1997
    Co-Authors: Thomas F. Rathbun, Steven K. Rogers, Martin P. Desimio, Mark E. Oxley
    Abstract:

    The MLP Iterative Construction Algorithm (MICA) designs a Multi-Layer Perceptron (MLP) neural network as it trains. MICA adds Hidden Layer Nodes (HLNs) one at a time, separating classes on a pair-wise basis, until the data is projected into a linearly separable space by class. Then MICA trains the Output Layer Nodes (OLNs), which results in an MLP that achieves 100% accuracy on the training data. MICA, like Backprop, produces an MLP that is a minimum mean squared error approximation of the Bayes optimal discriminant function. Moreover, MICA's training technique yields a novel feature selection technique and a hidden node pruning technique.

    1 INTRODUCTION

    The MLP Iterative Construction Algorithm (MICA) constructs a Multi-Layer Perceptron (MLP) neural network for solving classification problems. MICA trains the Hidden Layer Nodes (HLNs), then trains the Output Layer Nodes (OLNs). The resulting MLP network correctly classifies the training data with 100% accuracy. MICA's generalization results compare favorably to Backprop's. On difficult training sets such as spiral data, MICA is …
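The construction loop the abstract describes — add pairwise-discriminant hidden nodes until the training data is separable by class in hidden space, then fit the output layer — can be sketched as follows. This is a minimal illustration under stated assumptions, not MICA's published training rule: the node-fitting step (plain least squares), the hard-threshold separability check, and all function names (`fit_linear`, `mica_sketch`, `predict`) are hypothetical stand-ins for demonstration.

```python
import numpy as np
from itertools import combinations

def fit_linear(X, t):
    """Least-squares linear fit with an appended bias term.
    Stand-in for MICA's node-training rule, which the abstract does not detail."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, t, rcond=None)
    return w

def hidden_codes(X, W):
    """Hard-threshold hidden-layer activations for a list of node weights."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.sign(Xb @ np.asarray(W).T)

def consistent(H, y):
    """True if no two training points share a hidden code but differ in class
    (a simple proxy for 'linearly separable by class in hidden space')."""
    seen = {}
    return all(seen.setdefault(tuple(h), c) == c for h, c in zip(H, y))

def mica_sketch(X, y):
    """Grow one hidden node per class pair until the training data is
    class-consistent in hidden space, then fit the output layer."""
    W_hidden = []
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        t = np.where(y[mask] == a, 1.0, -1.0)       # signed pairwise target
        W_hidden.append(fit_linear(X[mask], t))      # one new hidden node
        if consistent(hidden_codes(X, W_hidden), y):
            break                                    # separable: stop adding
    H = hidden_codes(X, W_hidden)
    T = np.eye(len(np.unique(y)))[y]                 # one-hot class targets
    W_out = fit_linear(H, T)                         # output-layer weights
    return W_hidden, W_out

def predict(X, W_hidden, W_out):
    H = hidden_codes(X, W_hidden)
    Hb = np.hstack([H, np.ones((len(H), 1))])
    return np.argmax(Hb @ W_out, axis=1)

# Toy linearly separable two-class set: the sketch reaches 100% training accuracy.
X = np.array([[-2., -2], [-2, -1], [-1, -2], [2, 2], [2, 1], [1, 2]])
y = np.array([0, 0, 0, 1, 1, 1])
Wh, Wo = mica_sketch(X, y)
acc = (predict(X, Wh, Wo) == y).mean()  # 1.0 on this toy set
```

The break condition mirrors the abstract's stopping criterion: nodes are added only until the hidden representation distinguishes the classes, after which output-layer training is a linear problem.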

Erwin Bolthausen - One of the best experts on this subject based on the ideXlab platform.

Thomas F. Rathbun - One of the best experts on this subject based on the ideXlab platform.

D. R. Karakhanyan - One of the best experts on this subject based on the ideXlab platform.

S E Derkachov - One of the best experts on this subject based on the ideXlab platform.