The experts below are selected from a list of 91,311 experts worldwide ranked by the ideXlab platform.
Wilson Rosa De Oliveira - One of the best experts on this subject based on the ideXlab platform.
- Quantum Perceptron over a field and neural network architecture selection in a quantum computer
Neural Networks, 2016
Co-Authors: Adenilton J Da Silva, Teresa B Ludermir, Wilson Rosa De Oliveira
Abstract: In this work, we propose a quantum neural network named quantum perceptron over a field (QPF). Quantum computers are not yet a reality, and the models and algorithms proposed in this work cannot be simulated in actual (or classical) computers. QPF is a direct generalization of a classical perceptron and solves some drawbacks found in previous models of quantum perceptrons. We also present a learning algorithm named Superposition based Architecture Learning algorithm (SAL) that optimizes the neural network weights and architectures. SAL searches for the best architecture in a finite set of neural network architectures with linear time over the number of patterns in the training set. SAL is the first learning algorithm to determine neural network architectures in polynomial time. This speedup is obtained by the use of quantum parallelism and a non-linear quantum operator.
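QPF is presented as a direct generalization of the classical perceptron, so the classical model is the natural reference point. A minimal sketch of that baseline follows; the AND-gate training set, integer learning rate, and function names are illustrative choices, not from the paper:

```python
# Minimal classical perceptron: the baseline model that QPF generalizes.
# The AND-gate training set and unit learning rate are illustrative.

def step(z):
    """Step activation: fire (1) when the weighted sum is non-negative."""
    return 1 if z >= 0 else 0

def train_perceptron(samples, labels, epochs=20):
    """Classical perceptron rule: w += (target - output) * x."""
    w = [0] * len(samples[0])
    b = 0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            err = target - step(sum(wi * xi for wi, xi in zip(w, x)) + b)
            w = [wi + err * xi for wi, xi in zip(w, x)]
            b += err
    return w, b

samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]  # logical AND: linearly separable, so training converges
w, b = train_perceptron(samples, labels)
predictions = [step(sum(wi * xi for wi, xi in zip(w, x)) + b) for x in samples]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches zero training error; QPF's SAL algorithm additionally searches over architectures, which has no classical analogue this simple.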
Francesco Petruccione - One of the best experts on this subject based on the ideXlab platform.
- Simulating a Perceptron on a quantum computer
Physics Letters Section A: General Atomic and Solid State Physics, 2015
Co-Authors: Maria Schuld, Ilya Sinayskiy, Francesco Petruccione
Abstract: Perceptrons are the basic computational unit of artificial neural networks, as they model the activation mechanism of an output neuron due to incoming signals from its neighbours. As linear classifiers, they play an important role in the foundations of machine learning. In the context of the emerging field of quantum machine learning, several attempts have been made to develop a corresponding unit using quantum information theory. Based on the quantum phase estimation algorithm, this paper introduces a quantum perceptron model imitating the step-activation function of a classical perceptron. This scheme requires resources in O(n) (where n is the size of the input) and promises efficient applications for more complex structures such as trainable quantum neural networks.
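The core idea — encode the weighted input as a phase, digitize it with a phase-estimation register, and read the step activation from the register's most significant bit — can be illustrated with a purely classical sketch. The normalization convention and register size below are assumptions for illustration, not the paper's exact construction:

```python
# Classical illustration of a phase-estimation-based step activation:
# the weighted input z is normalized, written into a phase phi in [0, 1),
# an ideal t-bit phase-estimation readout digitizes phi, and the most
# significant bit decides whether the perceptron fires. All constants
# here (the 2n + 1 normalizer, t = 8) are illustrative assumptions.

def quantum_step_activation(weights, x, t=8):
    """Step activation read from an ideal t-bit phase-estimation result."""
    # Normalize so |z| < 1/2 for weights and inputs in [-1, 1].
    z = sum(w * xi for w, xi in zip(weights, x)) / (2 * len(x) + 1)
    phi = z % 1.0                 # phase in [0, 1); negative z wraps into [1/2, 1)
    j = int(phi * 2**t) % 2**t    # ideal phase estimation: j / 2**t ~ phi
    msb = (j >> (t - 1)) & 1      # most significant qubit: 1 iff phi >= 1/2
    return 0 if msb else 1        # fires exactly when z >= 0

print(quantum_step_activation([1, 1], [1, 1]))    # positive weighted sum -> 1
print(quantum_step_activation([-1, -1], [1, 1]))  # negative weighted sum -> 0
```

The mapping exploits Python's modulo semantics for negative floats: a negative z lands in the upper half of [0, 1), so the sign of the weighted sum becomes the top bit of the estimated phase.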
S. Mukhopadhyay - One of the best experts on this subject based on the ideXlab platform.
- A polynomial time algorithm for generating neural networks for classification problems
[Proceedings 1992] IJCNN International Joint Conference on Neural Networks, 1
Co-Authors: Asim Roy, S. Mukhopadhyay
Abstract: A novel polynomial time algorithm for the construction and training of multilayer perceptrons for classification problems is presented. It uses linear programming models to incrementally generate the hidden layer of a restricted higher-order perceptron. The polynomial time complexity of the method is proven, and computational results are provided for some well-known problems. In all cases, very small nets were created compared to those reported previously.
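One way to see why higher-order units help: XOR defeats any single linear perceptron, but one added product feature restores linear separability. The sketch below uses a hand-picked second-order feature and plain perceptron training rather than the paper's linear-programming construction; it only illustrates the "restricted higher-order" idea:

```python
# XOR is not linearly separable, so no single linear perceptron learns it.
# Adding one product (second-order) feature x1 * x2 makes it separable,
# which is the intuition behind a restricted higher-order perceptron.
# The hand-picked feature and plain perceptron training are illustrative;
# the paper selects such hidden units via linear programming instead.

def step(z):
    return 1 if z >= 0 else 0

def train(samples, labels, epochs=25):
    """Standard perceptron updates with learning rate 1."""
    w = [0] * len(samples[0])
    b = 0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            err = target - step(sum(wi * xi for wi, xi in zip(w, x)) + b)
            w = [wi + err * xi for wi, xi in zip(w, x)]
            b += err
    return w, b

xor_inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
xor_labels = [0, 1, 1, 0]

# Expand each input with the higher-order product term x1 * x2.
expanded = [(x1, x2, x1 * x2) for x1, x2 in xor_inputs]
w, b = train(expanded, xor_labels)
preds = [step(sum(wi * xi for wi, xi in zip(w, x)) + b) for x in expanded]
```

On the expanded inputs the data is separable (e.g. weights (1, 1, -3) with bias -1 classify XOR correctly), so the training loop converges to zero error.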