The experts below are selected from a list of 20,400 experts worldwide, ranked by the ideXlab platform.
Mark E. Oxley - One of the best experts on this subject based on the ideXlab platform.
-
MLP Iterative Construction algorithm
Neurocomputing, 1997. Co-Authors: Thomas F. Rathbun, Steven K. Rogers, Martin P. Desimio, Mark E. Oxley. Abstract: This paper presents a novel architecture-selection and weight-training algorithm for multi-layer perceptron neural networks applied to classification problems. The MLP Iterative Construction Algorithm (MICA) autonomously constructs an MLP neural network as it trains. Experimental results show the algorithm achieves 100% accuracy on the training data and the same or better generalization accuracy as Backprop on the test data, while using fewer FLOPs. Moreover, relaxation of the hidden-layer nodes improves test-set recognition accuracy beyond that of Backprop. Furthermore, seeding the Backprop algorithm with the hidden-layer weights from MICA is demonstrated. MICA seeding improves the effectiveness of Backprop and enables Backprop to solve a new class of problems, i.e., problems with areas of low mean-squared error.
-
MLP Iterative Construction algorithm
Applications and Science of Artificial Neural Networks III, 1997. Co-Authors: Thomas F. Rathbun, Steven K. Rogers, Martin P. Desimio, Mark E. Oxley. Abstract: The MLP Iterative Construction Algorithm (MICA) designs a Multi-Layer Perceptron (MLP) neural network as it trains. MICA adds Hidden Layer Nodes (HLNs) one at a time, separating classes on a pair-wise basis, until the data is projected into a linearly separable space by class. MICA then trains the Output Layer Nodes (OLNs), which results in an MLP that achieves 100% accuracy on the training data. MICA, like Backprop, produces an MLP that is a minimum mean-squared-error approximation of the Bayes optimal discriminant function. Moreover, MICA's training technique yields a novel feature-selection technique and a hidden-node pruning technique. 1 INTRODUCTION: The MLP Iterative Construction Algorithm (MICA) constructs a Multi-Layer Perceptron (MLP) neural network for solving classification problems. MICA trains the Hidden Layer Nodes (HLNs), then trains the Output Layer Nodes (OLNs). The resulting MLP network classifies the training data with 100% accuracy. MICA's generalization results compare favorably to Backprop's. On difficult training sets such as spiral data, MICA is …
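The constructive idea in the abstract above — add pairwise-separating hidden nodes until the data becomes linearly separable, then fit the output layer — can be sketched numerically. The following is a minimal illustrative sketch, not MICA's actual training rule (which the abstracts do not spell out): here each hidden unit is simply a hyperplane bisecting one pair of class means, and the output layer is fit by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_unit(X, y, c1, c2):
    """One hidden unit whose hyperplane bisects the means of classes c1 and c2.
    An illustrative stand-in for MICA's pairwise hidden-node training."""
    m1, m2 = X[y == c1].mean(axis=0), X[y == c2].mean(axis=0)
    w = m1 - m2
    b = -w @ (m1 + m2) / 2.0
    return w, b

def fit_constructive_mlp(X, y):
    """Add one hidden node per class pair, then fit the output layer."""
    classes = np.unique(y)
    units = [pairwise_unit(X, y, a, c)
             for i, a in enumerate(classes) for c in classes[i + 1:]]
    W = np.array([w for w, _ in units])               # hidden weights
    b = np.array([bias for _, bias in units])         # hidden biases
    H = np.tanh(X @ W.T + b)                          # hidden-layer outputs
    T = np.eye(len(classes))[np.searchsorted(classes, y)]   # one-hot targets
    V, *_ = np.linalg.lstsq(np.c_[H, np.ones(len(H))], T, rcond=None)
    return W, b, V, classes

def predict(model, X):
    W, b, V, classes = model
    scores = np.c_[np.tanh(X @ W.T + b), np.ones(len(X))] @ V
    return classes[scores.argmax(axis=1)]

# toy problem: three well-separated Gaussian blobs
centers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
X = np.vstack([c + 0.5 * rng.standard_normal((50, 2)) for c in centers])
y = np.repeat([0, 1, 2], 50)

model = fit_constructive_mlp(X, y)
train_acc = (predict(model, X) == y).mean()
```

On this easy toy set the constructed network reaches 100% training accuracy, mirroring the property claimed in the abstract; the sketch omits MICA's relaxation, pruning, and Backprop-seeding steps.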
Erwin Bolthausen - One of the best experts on this subject based on the ideXlab platform.
-
An Iterative Construction of Solutions of the TAP Equations for the Sherrington–Kirkpatrick Model
Communications in Mathematical Physics, 2014. Co-Authors: Erwin Bolthausen. Abstract: We propose an Iterative scheme for the solutions of the TAP equations in the Sherrington–Kirkpatrick model which is shown to converge up to and including the de Almeida–Thouless line. The main tool is a representation of the iterations which reveals an interesting structure. This representation does not depend on the temperature parameter, but for temperatures below the de Almeida–Thouless line it contains a part which does not converge to zero in the limit.
-
An Iterative Construction of Solutions of the TAP Equations for the Sherrington–Kirkpatrick Model
arXiv: Probability, 2012. Co-Authors: Erwin Bolthausen. Abstract: We propose an Iterative Construction of solutions of the Thouless–Anderson–Palmer equations for the Sherrington–Kirkpatrick model. The Iterative scheme is proved to converge exactly up to the de Almeida–Thouless line. No results on the SK model itself are derived.
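The distinctive feature of Bolthausen's scheme is that the Onsager reaction term is evaluated at the *previous* iterate, giving a two-step recursion. The sketch below illustrates an iteration of this flavor at high temperature, well inside the convergence region; it is only in the spirit of the paper (the actual construction is more refined, and the parameters here are chosen for illustration).

```python
import numpy as np

rng = np.random.default_rng(1)

# Sherrington-Kirkpatrick couplings: symmetric Gaussian, variance 1/N
N, beta, h = 2000, 0.3, 0.5      # high temperature, deep inside the AT region
G = rng.standard_normal((N, N))
J = (G + G.T) / np.sqrt(2 * N)
np.fill_diagonal(J, 0.0)

# Edwards-Anderson parameter: q = E[tanh^2(beta*sqrt(q)*Z + h)], Z ~ N(0,1),
# solved by Monte-Carlo fixed-point iteration
z = rng.standard_normal(200_000)
q = 0.5
for _ in range(100):
    q = float(np.mean(np.tanh(beta * np.sqrt(q) * z + h) ** 2))

# two-step iteration: Onsager term uses the previous iterate
m_prev = np.zeros(N)
m = np.full(N, np.tanh(h))
for _ in range(100):
    m_next = np.tanh(beta * (J @ m) + h - beta**2 * (1.0 - q) * m_prev)
    m_prev, m = m, m_next

# residual of the TAP equations at the limit point
residual = float(np.max(np.abs(
    m - np.tanh(beta * (J @ m) + h - beta**2 * (1.0 - q) * m))))
```

At this temperature the iterates converge geometrically, the TAP residual vanishes, and the self-overlap of the limit point is close to q, consistent with the convergence regime the abstract describes.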
Thomas F. Rathbun - One of the best experts on this subject based on the ideXlab platform.
-
MLP Iterative Construction algorithm
Neurocomputing, 1997. Co-Authors: Thomas F. Rathbun, Steven K. Rogers, Martin P. Desimio, Mark E. Oxley. Abstract: This paper presents a novel architecture-selection and weight-training algorithm for multi-layer perceptron neural networks applied to classification problems. The MLP Iterative Construction Algorithm (MICA) autonomously constructs an MLP neural network as it trains. Experimental results show the algorithm achieves 100% accuracy on the training data and the same or better generalization accuracy as Backprop on the test data, while using fewer FLOPs. Moreover, relaxation of the hidden-layer nodes improves test-set recognition accuracy beyond that of Backprop. Furthermore, seeding the Backprop algorithm with the hidden-layer weights from MICA is demonstrated. MICA seeding improves the effectiveness of Backprop and enables Backprop to solve a new class of problems, i.e., problems with areas of low mean-squared error.
-
MLP Iterative Construction algorithm
Applications and Science of Artificial Neural Networks III, 1997. Co-Authors: Thomas F. Rathbun, Steven K. Rogers, Martin P. Desimio, Mark E. Oxley. Abstract: The MLP Iterative Construction Algorithm (MICA) designs a Multi-Layer Perceptron (MLP) neural network as it trains. MICA adds Hidden Layer Nodes (HLNs) one at a time, separating classes on a pair-wise basis, until the data is projected into a linearly separable space by class. MICA then trains the Output Layer Nodes (OLNs), which results in an MLP that achieves 100% accuracy on the training data. MICA, like Backprop, produces an MLP that is a minimum mean-squared-error approximation of the Bayes optimal discriminant function. Moreover, MICA's training technique yields a novel feature-selection technique and a hidden-node pruning technique. 1 INTRODUCTION: The MLP Iterative Construction Algorithm (MICA) constructs a Multi-Layer Perceptron (MLP) neural network for solving classification problems. MICA trains the Hidden Layer Nodes (HLNs), then trains the Output Layer Nodes (OLNs). The resulting MLP network classifies the training data with 100% accuracy. MICA's generalization results compare favorably to Backprop's. On difficult training sets such as spiral data, MICA is …
D. R. Karakhanyan - One of the best experts on this subject based on the ideXlab platform.
-
Jordan–Schwinger Representations and Factorised Yang–Baxter Operators
2010. Co-Authors: D. R. Karakhanyan. Abstract: The Construction elements of the factorised form of the Yang–Baxter R-operator acting on generic representations of q-deformed sℓ(n + 1) are studied. We rely on the Iterative Construction of such representations by the restricted class of Jordan–Schwinger representations. The latter are formulated explicitly. On this basis the parameter-exchange and intertwining operators are derived.
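In the simplest undeformed case, the Jordan–Schwinger construction mentioned above realises the sℓ(2) generators from two oscillator modes: J⁺ = a†b, J⁻ = b†a, J_z = (a†a − b†b)/2. A small numerical check of this (using truncated oscillators, and restricting to a fixed total occupation n, where the cutoff is harmless and the states form a spin-j multiplet with j = n/2):

```python
import numpy as np

d = 6                                        # Fock-space cutoff per mode
a = np.diag(np.sqrt(np.arange(1.0, d)), 1)   # truncated annihilation operator
I = np.eye(d)
A, B = np.kron(a, I), np.kron(I, a)          # two independent oscillator modes

# Jordan-Schwinger map for sl(2): J+ = a†b, J- = b†a, Jz = (a†a - b†b)/2
Jp = A.T @ B
Jm = B.T @ A
Jz = (A.T @ A - B.T @ B) / 2.0

# J+, J-, Jz preserve the total occupation n_a + n_b; each level n < d is
# an irreducible spin-j multiplet with j = n/2, untouched by the cutoff
n = 3
num = np.kron(np.diag(np.arange(d)), I) + np.kron(I, np.diag(np.arange(d)))
P = np.diag((np.diag(num) == n).astype(float))   # projector onto n_a + n_b = n

comm_z = P @ (Jz @ Jp - Jp @ Jz - Jp) @ P          # [Jz, J+] - J+   -> 0
comm_pm = P @ (Jp @ Jm - Jm @ Jp - 2.0 * Jz) @ P   # [J+, J-] - 2*Jz -> 0
casimir = Jz @ Jz + (Jp @ Jm + Jm @ Jp) / 2.0
cas_err = np.max(np.abs(P @ casimir @ P - (n / 2) * (n / 2 + 1) * P))
```

The commutators close and the Casimir is the scalar j(j + 1) on the multiplet, which is the sense in which the oscillator pair carries the representation; the q-deformed version used in the paper replaces these relations by their U_q(sℓ(2)) analogues.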
-
Iterative Construction of U_q(sℓ(n+1)) Representations and Lax Matrix Factorisation
Letters in Mathematical Physics, 2008. Co-Authors: S. E. Derkachov, D. R. Karakhanyan, R. Kirschner, P. Valinevich. Abstract: The Iterative Construction of a generic representation of gℓ(n + 1), or of the trigonometric deformation of its enveloping algebra, is conveniently formulated in terms of Lax matrices. The Lax matrix of the constructed representation factorises into parts determined by the Lax matrix of a generic representation of the algebra with reduced rank and others appearing in the factorised expression of the Lax matrix of the special Jordan–Schwinger representation.
S E Derkachov - One of the best experts on this subject based on the ideXlab platform.
-
Iterative Construction of Eigenfunctions of the Monodromy Matrix for the SL(2, ℂ) Magnet
Journal of Physics A, 2014. Co-Authors: S. E. Derkachov, A. N. Manashov. Abstract: Eigenfunctions of the matrix elements of the monodromy matrix provide a convenient basis for studies of spin-chain models. We present an Iterative method for constructing the eigenfunctions in the case of SL(2, ℂ) spin chains. We derive an explicit integral representation for the eigenfunctions and calculate the corresponding scalar products (Sklyanin's measure).
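The objects in this abstract can be made concrete in the simplest compact setting. The sketch below builds the monodromy matrix T(u) = L_N(u)···L_1(u) for a short spin-1/2 XXX chain (a finite-dimensional toy, not the noncompact SL(2, ℂ) magnet of the paper) and checks numerically that transfer matrices t(u) = tr_a T(u) commute at different spectral parameters, and that the off-diagonal elements B(u) = T₁₂(u) commute among themselves — the property that makes a joint eigenbasis of monodromy matrix elements possible.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]

N = 4            # number of sites (kept small: operators are 2^N x 2^N)
dim = 2 ** N

def site_op(op, n):
    """Embed a single-site operator at site n of the chain."""
    out = np.eye(1, dtype=complex)
    for m in range(N):
        out = np.kron(out, op if m == n else np.eye(2, dtype=complex))
    return out

def lax(u, n):
    """Lax matrix L_n(u) = u*1 + i*sum_a sigma^a (x) S^a_n (one common
    convention), stored as a 2x2 array of quantum-space operators."""
    L = [[np.zeros((dim, dim), dtype=complex) for _ in range(2)]
         for _ in range(2)]
    L[0][0] += u * np.eye(dim)
    L[1][1] += u * np.eye(dim)
    for s in paulis:
        Sn = site_op(s / 2, n)        # spin-1/2 generators S^a = sigma^a / 2
        for i in range(2):
            for j in range(2):
                L[i][j] = L[i][j] + 1j * s[i, j] * Sn
    return L

def monodromy(u):
    """T(u) = L_{N-1}(u) ... L_0(u): ordered product in auxiliary space."""
    T = lax(u, 0)
    for n in range(1, N):
        Ln = lax(u, n)
        T = [[Ln[i][0] @ T[0][j] + Ln[i][1] @ T[1][j] for j in range(2)]
             for i in range(2)]
    return T

T1, T2 = monodromy(0.7), monodromy(-1.3)
t1, t2 = T1[0][0] + T1[1][1], T2[0][0] + T2[1][1]      # transfer matrices
comm_t = np.max(np.abs(t1 @ t2 - t2 @ t1))             # [t(u), t(v)] -> 0
comm_B = np.max(np.abs(T1[0][1] @ T2[0][1] - T2[0][1] @ T1[0][1]))
```

Both commutators vanish to machine precision, as guaranteed by the RTT relation; in the noncompact SL(2, ℂ) case the quantum space is infinite-dimensional and the analogous statements are what the paper's iterative construction and integral representations make explicit.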
-
Iterative Construction of Eigenfunctions of the Monodromy Matrix for the SL(2, ℂ) Magnet
arXiv: Mathematical Physics, 2014. Co-Authors: S. E. Derkachov, A. N. Manashov. Abstract: Eigenfunctions of the matrix elements of the monodromy matrix provide a convenient basis for studies of spin-chain models. We present an Iterative method for constructing the eigenfunctions in the case of the SL(2, ℂ) spin chains. We derive an explicit integral representation for the eigenfunctions and calculate the corresponding scalar products (Sklyanin's measure).
-
Iterative Construction of U_q(sℓ(n+1)) Representations and Lax Matrix Factorisation
Letters in Mathematical Physics, 2008. Co-Authors: S. E. Derkachov, D. R. Karakhanyan, R. Kirschner, P. Valinevich. Abstract: The Iterative Construction of a generic representation of gℓ(n + 1), or of the trigonometric deformation of its enveloping algebra, is conveniently formulated in terms of Lax matrices. The Lax matrix of the constructed representation factorises into parts determined by the Lax matrix of a generic representation of the algebra with reduced rank and others appearing in the factorised expression of the Lax matrix of the special Jordan–Schwinger representation.