Perceptrons

The Experts below are selected from a list of 10242 Experts worldwide ranked by ideXlab platform

Keshab K Parhi - One of the best experts on this subject based on the ideXlab platform.

  • Molecular and DNA artificial neural networks via fractional coding
    IEEE Transactions on Biomedical Circuits and Systems, 2020
    Co-Authors: Xingyi Liu, Keshab K Parhi
    Abstract:

    This article considers the implementation of artificial neural networks (ANNs) using molecular computing and DNA based on fractional coding. Prior work had addressed molecular two-layer ANNs with binary inputs and arbitrary weights. In prior work using fractional coding, a simple molecular perceptron was presented that computes the sigmoid of a scaled weighted sum of the inputs, where the inputs and the weights lie in $[-1,1]$. Even for computing the perceptron, the prior approach suffers from two major limitations. First, it cannot compute the sigmoid of the weighted sum itself, only the sigmoid of a scaled weighted sum. Second, many machine learning applications require coefficients that are arbitrary positive and negative numbers not bounded in $[-1,1]$; such numbers cannot be handled by the prior perceptron using fractional coding. This paper makes four contributions. First, molecular perceptrons that can handle arbitrary weights and can compute the sigmoid of the weighted sum are presented; these molecular perceptrons are therefore well suited to regression applications and multi-layer ANNs. A new molecular divider is introduced and used to compute $\mathrm{sigmoid}(ax)$ where $a > 1$. Second, based on fractional coding, a molecular artificial neural network (ANN) with one hidden layer is presented. Third, a trained ANN classifier with one hidden layer, from a seizure prediction application based on electroencephalogram data, is mapped to molecular reactions and DNA, and its performance is presented. Fourth, molecular activation functions for the rectified linear unit (ReLU) and softmax are also presented.
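
As a plain numerical reference point (not the molecular reactions themselves), the perceptron the abstract describes, the sigmoid of an unscaled weighted sum with weights outside $[-1,1]$, can be sketched in Python. The two-rail `encode`/`decode` convention below is an illustrative stand-in for bipolar fractional coding, where a value in $[-1,1]$ is represented by a pair of non-negative quantities; the exact convention and the example inputs are assumptions for illustration.

```python
import math

def sigmoid(z):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-z))

def perceptron(inputs, weights):
    """Sigmoid of the (unscaled) weighted sum: the quantity the
    paper's molecular perceptrons are designed to compute."""
    z = sum(w * x for w, x in zip(weights, inputs))
    return sigmoid(z)

def encode(v):
    """Two-rail stand-in for bipolar fractional coding of v in [-1, 1]:
    v is represented as the difference of two non-negative rails."""
    return ((1.0 + v) / 2.0, (1.0 - v) / 2.0)

def decode(p1, p0):
    """Recover the value from its two rails."""
    return p1 - p0

# Illustrative inputs in [-1, 1]; weights need NOT lie in [-1, 1].
x = [0.5, -0.25, 0.75]
w = [2.0, -3.0, 1.5]
y = perceptron(x, w)   # a value in (0, 1)
```

The point of the sketch is the contrast the abstract draws: the inputs stay in $[-1,1]$ (encodable by the two rails), while the weights are unconstrained.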

Youngah Park - One of the best experts on this subject based on the ideXlab platform.

  • Generalization in a perceptron with a sigmoid transfer function
    International Joint Conference on Neural Networks, 1993
    Co-Authors: Sanghun Ha, Kukjin Kang, Chulan Kwon, Jonghoon Oh, Youngah Park
    Abstract:

    Learning in layered neural networks is studied using the methods of statistical mechanics. Networks are trained from examples using the Gibbs algorithm. We focus on the generalization curve, i.e. the average generalization error as a function of the number of examples. We consider perceptron learning with a sigmoid transfer function. Ising perceptrons, with weights constrained to be discrete, exhibit sudden learning at low temperatures within the annealed approximation: there is a first-order transition from a state of poor generalization to a state of perfect generalization. When the transfer function is smooth, the first-order transition occurs only at low temperatures and becomes continuous at high temperatures. When the transfer function is steep, the first-order transition line extends to higher temperatures. The analytic results show good agreement with computer simulations.
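
The teacher-student setting behind the generalization curve can be illustrated with a small simulation. This is only a sketch under simplifying assumptions: the dimension, the steepness `beta`, and plain gradient descent (rather than Gibbs sampling at finite temperature, as in the paper) are all illustrative choices. What it shows is the quantity the curve tracks: average generalization error falling as the number of training examples grows.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20            # input dimension (illustrative)
beta = 2.0        # steepness of the sigmoid transfer function (assumed)

def g(h):
    """Smooth sigmoid transfer function."""
    return np.tanh(beta * h)

# Teacher perceptron with unit-norm weights.
teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)

def gen_error(student, n_test=2000):
    """Mean squared output disagreement with the teacher on fresh inputs."""
    X = rng.standard_normal((n_test, N)) / np.sqrt(N)
    return float(np.mean((g(X @ student) - g(X @ teacher)) ** 2))

errors = []
for p in (5, 20, 80, 320):                    # training-set sizes
    X = rng.standard_normal((p, N)) / np.sqrt(N)
    y = g(X @ teacher)                        # teacher-labeled examples
    w = rng.standard_normal(N)                # random student init
    for _ in range(3000):                     # plain gradient descent on MSE
        out = g(X @ w)
        grad = ((out - y) * beta * (1.0 - out ** 2)) @ X / p
        w -= 0.2 * grad
    errors.append(gen_error(w))
# errors should shrink as the number of examples p grows
```

With discrete (Ising) weights the paper's annealed analysis predicts a sudden first-order drop in this curve at low temperature, which a zero-temperature gradient sketch like this cannot reproduce; here the decay is gradual.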

William M P Klein - One of the best experts on this subject based on the ideXlab platform.

  • Risk perceptions and health behavior
    Current opinion in psychology, 2015
    Co-Authors: Rebecca A. Ferrer, William M P Klein
    Abstract:

    Risk perceptions - an individual's perceived susceptibility to a threat - are a key component of many health behavior change theories. Risk perceptions are often targeted in health behavior change interventions, and recent meta-analytic evidence suggests that interventions that successfully engage and change risk perceptions produce subsequent increases in health behaviors. Here, we review recent literature on risk perceptions and health behavior, including research on the formation of risk perceptions, types of risk perceptions (deliberative, affective, and experiential), the accuracy of risk perceptions, and associations and interactions among these types. Taken together, existing research suggests that disease risk perceptions are a critical determinant of health behavior, although the nature of the association between risk perceptions and health behavior may depend on the profile of the different types of risk perceptions and their accuracy.

Francisco J. Ropero-Peláez - One of the best experts on this subject based on the ideXlab platform.

  • On the biological plausibility of artificial metaplasticity learning algorithm
    Neurocomputing, 2013
    Co-Authors: Diego Andina, Francisco J. Ropero-Peláez
    Abstract:

    The training algorithm studied in this paper is inspired by the biological metaplasticity property of neurons. During the training phase, the Artificial Metaplasticity Learning Algorithm can be considered a new probabilistic version of the presynaptic rule: the algorithm assigns larger weight updates to the less probable activations than to the more probable ones. The algorithm is proposed for artificial neural networks in general, although at present it has been implemented and tested only for multilayer perceptrons. Tested on different multidisciplinary applications, experiments show much more efficient training, also improving multilayer perceptron results to the level of the best state-of-the-art systems, which are usually much more complex.
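
The core update rule, larger weight changes for less probable activations, can be sketched as follows. This is only an illustration of the idea, not the paper's exact formulation: the standard-normal input-density estimate, the tanh unit, the clipping constant `eps`, and the learning rate are all assumed for the sketch.

```python
import numpy as np

def input_density(x):
    """Assumed standard-normal density of the input pattern: an
    illustrative stand-in for the real pattern distribution."""
    d = len(x)
    return np.exp(-0.5 * (x @ x)) / (2.0 * np.pi) ** (d / 2.0)

def metaplastic_step(w, x, target, lr=1e-3, eps=1e-3):
    """One sigmoid-unit update whose step size is weighted by the
    inverse of the input's estimated probability density, so rare
    activations receive larger updates than frequent ones."""
    y = np.tanh(w @ x)
    grad = (y - target) * (1.0 - y ** 2) * x   # squared-error gradient
    scale = 1.0 / max(input_density(x), eps)   # metaplasticity weighting
    return w - lr * scale * grad
```

The design choice to illustrate: without `scale` this is an ordinary gradient step; the density-dependent factor is what makes frequent (high-density) patterns barely move the weights while rare patterns move them strongly.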