The Experts below are selected from a list of 81 Experts worldwide, ranked by the ideXlab platform.
Bing J. Sheu - One of the best experts on this subject based on the ideXlab platform.
-
A Gaussian Synapse circuit for analog VLSI neural networks
IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 1994
Co-Authors: Joongho Choi, Bing J. Sheu, Jui-ming Chang
Abstract: Back-propagation neural networks with Gaussian function Synapses have better convergence properties than those with linear-multiplying Synapses. In digital simulation, more computing time is spent on Gaussian function evaluation. We present a compact analog Synapse cell which is not biased in the subthreshold region, for fully-parallel operation. This cell can approximate a Gaussian function with accuracy around 98% in the ideal case. Device mismatch induced by the fabrication process causes some degradation of this approximation. The Gaussian Synapse cell can also be used in unsupervised learning. Programmability of the proposed Gaussian Synapse cell is achieved by changing the stored Synapse Weight W_ji, the reference current, and the sizes of the transistors in the differential pair.
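In software terms, the ideal behavior of such a cell is a weight-scaled Gaussian of its input. A minimal sketch, assuming a standard Gaussian form; the `mu` and `sigma` parameters here are illustrative stand-ins for what the circuit sets via its reference current and differential-pair transistor sizing:

```python
import math

def gaussian_synapse(x, w_ji, mu=0.0, sigma=1.0):
    """Ideal Gaussian synapse output: the stored synapse weight w_ji
    scales a bell curve centered at mu with width sigma."""
    return w_ji * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# The output peaks at the center and decays symmetrically with distance.
print(gaussian_synapse(0.0, w_ji=1.0))  # peak value: 1.0
print(gaussian_synapse(1.0, w_ji=1.0))  # exp(-0.5), about 0.607
```

The paper's ~98% accuracy figure refers to how closely the analog cell tracks this ideal curve; the function above is the target, not the circuit model.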
-
A programmable analog VLSI neural network processor for communication receivers
IEEE Transactions on Neural Networks, 1993
Co-Authors: J. Choi, S.h. Bang, Bing J. Sheu
Abstract: An analog VLSI neural network processor was designed and fabricated for communication receiver applications. It does not require prior estimation of the channel characteristics. A powerful channel equalizer was implemented with this processor chip configured as a four-layered perceptron network. The compact Synapse cell is realized with an enhanced wide-range Gilbert multiplier circuit. The output neuron consists of a linear current-to-voltage converter and a sigmoid function generator with a controllable voltage gain. Network training is performed by the modified Kalman neuro-filtering algorithm to speed up the convergence process for intersymbol interference and white Gaussian noise communication channels. The learning process is done on the companion DSP board, which also keeps the Synapse Weights for later use by the chip. The VLSI neural network processor chip occupies a silicon area of 4.6 mm × 6.8 mm and was fabricated in a 2-µm double-polysilicon CMOS technology. System analysis and experimental results are presented.
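The inference path of such an equalizer is a plain multilayer-perceptron forward pass: each layer multiplies its input by a synapse weight matrix and passes the result through a gain-controllable sigmoid. A minimal sketch under those assumptions; the layer sizes and weights below are illustrative, not the chip's:

```python
import numpy as np

def sigmoid(v, gain=1.0):
    # Sigmoid with a controllable gain, mirroring the output neuron's
    # sigmoid function generator with adjustable voltage gain.
    return 1.0 / (1.0 + np.exp(-gain * v))

def equalizer_forward(x, weights, gain=1.0):
    """Forward pass of a perceptron-network equalizer.

    x: a window of received channel samples.
    weights: one synapse weight matrix per layer.
    """
    a = np.asarray(x, dtype=float)
    for W in weights:
        a = sigmoid(W @ a, gain)
    return a

rng = np.random.default_rng(0)
layers = [rng.standard_normal((4, 5)),
          rng.standard_normal((3, 4)),
          rng.standard_normal((1, 3))]
y = equalizer_forward(rng.standard_normal(5), layers)
print(y.shape)  # (1,): one equalized output per input window
```

On the actual system, the weight matrices would be trained off-chip (here, by the modified Kalman neuro-filtering algorithm on the DSP board) and then loaded into the analog synapse cells.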
-
A Programmable Processor for Analog VLSI Neural Network Communication Receivers
1993
Co-Authors: Joongho Choi, S.h. Bang, Bing J. Sheu
Abstract: An analog VLSI neural network processor was designed and fabricated for communication receiver applications. It does not require a priori estimation of the channel characteristics. A powerful channel equalizer was implemented with this processor chip configured as a four-layered perceptron network. The compact Synapse cell is realized with an enhanced wide-range Gilbert multiplier circuit. The output neuron consists of a linear current-to-voltage converter and a sigmoid function generator with a controllable voltage gain. Network training is performed by the modified Kalman neuro-filtering algorithm to speed up the convergence process for intersymbol interference and white Gaussian noise communication channels. The learning process is done on the companion DSP board, which also keeps the Synapse Weights for later use by the chip. The VLSI neural network processor chip occupies a silicon area of 4.6 mm × 6.8 mm and was fabricated in a 2-µm double-polysilicon CMOS technology. System analysis and experimental results are presented.
J. Choi
-
A programmable analog VLSI neural network processor for communication receivers
IEEE Transactions on Neural Networks, 1993
Co-Authors: J. Choi, S.h. Bang, Bing J. Sheu
Abstract: An analog VLSI neural network processor was designed and fabricated for communication receiver applications. It does not require prior estimation of the channel characteristics. A powerful channel equalizer was implemented with this processor chip configured as a four-layered perceptron network. The compact Synapse cell is realized with an enhanced wide-range Gilbert multiplier circuit. The output neuron consists of a linear current-to-voltage converter and a sigmoid function generator with a controllable voltage gain. Network training is performed by the modified Kalman neuro-filtering algorithm to speed up the convergence process for intersymbol interference and white Gaussian noise communication channels. The learning process is done on the companion DSP board, which also keeps the Synapse Weights for later use by the chip. The VLSI neural network processor chip occupies a silicon area of 4.6 mm × 6.8 mm and was fabricated in a 2-µm double-polysilicon CMOS technology. System analysis and experimental results are presented.
S.h. Bang
-
A programmable analog VLSI neural network processor for communication receivers
IEEE Transactions on Neural Networks, 1993
Co-Authors: J. Choi, S.h. Bang, Bing J. Sheu
Abstract: An analog VLSI neural network processor was designed and fabricated for communication receiver applications. It does not require prior estimation of the channel characteristics. A powerful channel equalizer was implemented with this processor chip configured as a four-layered perceptron network. The compact Synapse cell is realized with an enhanced wide-range Gilbert multiplier circuit. The output neuron consists of a linear current-to-voltage converter and a sigmoid function generator with a controllable voltage gain. Network training is performed by the modified Kalman neuro-filtering algorithm to speed up the convergence process for intersymbol interference and white Gaussian noise communication channels. The learning process is done on the companion DSP board, which also keeps the Synapse Weights for later use by the chip. The VLSI neural network processor chip occupies a silicon area of 4.6 mm × 6.8 mm and was fabricated in a 2-µm double-polysilicon CMOS technology. System analysis and experimental results are presented.
-
A Programmable Processor for Analog VLSI Neural Network Communication Receivers
1993
Co-Authors: Joongho Choi, S.h. Bang, Bing J. Sheu
Abstract: An analog VLSI neural network processor was designed and fabricated for communication receiver applications. It does not require a priori estimation of the channel characteristics. A powerful channel equalizer was implemented with this processor chip configured as a four-layered perceptron network. The compact Synapse cell is realized with an enhanced wide-range Gilbert multiplier circuit. The output neuron consists of a linear current-to-voltage converter and a sigmoid function generator with a controllable voltage gain. Network training is performed by the modified Kalman neuro-filtering algorithm to speed up the convergence process for intersymbol interference and white Gaussian noise communication channels. The learning process is done on the companion DSP board, which also keeps the Synapse Weights for later use by the chip. The VLSI neural network processor chip occupies a silicon area of 4.6 mm × 6.8 mm and was fabricated in a 2-µm double-polysilicon CMOS technology. System analysis and experimental results are presented.
Huolin Huang
-
Effects of the W/WO3-x junction on the synaptic characteristics of W/WO3-x/ITO memristors
Physica E: Low-dimensional Systems and Nanostructures, 2021
Co-Authors: Yanhong Liu, Chunxia Wang, Yusheng Wang, Huolin Huang
Abstract: Two tungsten/tungsten oxide/indium tin oxide (W/WO3-x/ITO) memristors were fabricated to comparatively study the effects of the W/WO3-x junction on their synaptic performance; in one device the junction is ohmic (Ohm-type device) and in the other it is rectified (Rec-type device). The characterization results show that the Ohm-type device exhibits rich synaptic properties, including controllable Synapse Weight updates obtained by adjusting the input pulse amplitude, interval, and number, as well as a transformation from paired-pulse facilitation (PPF) to paired-pulse depression (PPD) obtained by changing only the pulse width. The Rec-type device, however, manifests nonconventional synaptic behavior and bowknot-shaped I–V curves. Fitting the I–V curves with various carrier-transport models demonstrates a bulk mechanism for the Ohm-type device and a Schottky mechanism for the Rec-type device. We therefore suggest that oxygen-vacancy injection and migration in the WO3-x layer produce the synaptic properties of the Ohm-type device, while the rectified W/WO3-x junction and the variable conductance of the WO3-x layer lead to the nonconventional synaptic properties and bowknot-shaped I–V curves. This report reveals the complexity of memristors and emphasizes the necessity of carefully selecting the material combination and designing the device structure to realize the desired synaptic properties.
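The Ohm-type device's behavior can be pictured with a toy phenomenological model: the conductance (the synapse weight) potentiates under narrow pulses and depresses under wide ones, with the change scaled by pulse amplitude and count. Everything below, including the width threshold and learning rate, is an illustrative assumption, not a fit to the measured device:

```python
def update_conductance(g, pulses, amplitude, width,
                       width_threshold=1e-3, eta=0.05):
    """Toy pulse-driven weight update for a memristive synapse.

    Narrow pulses (width < width_threshold) facilitate (PPF-like),
    wide pulses depress (PPD-like); the update saturates so that
    g stays in (0, 1).
    """
    sign = 1.0 if width < width_threshold else -1.0
    for _ in range(pulses):
        if sign > 0:
            g += eta * amplitude * (1.0 - g)   # potentiation toward g = 1
        else:
            g -= eta * amplitude * g           # depression toward g = 0
    return g

g0 = 0.2
g1 = update_conductance(g0, pulses=5, amplitude=1.0, width=5e-4)  # narrow: g rises
g2 = update_conductance(g1, pulses=5, amplitude=1.0, width=5e-3)  # wide: g falls
print(g0, g1, g2)
```

The point of the sketch is only the qualitative sign flip with pulse width; the paper's actual mechanism is oxygen-vacancy migration in the WO3-x layer.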
Dmitri B. Chklovskii
-
Neuronal Circuits Underlying Persistent Representations Despite Time Varying Activity
Current Biology: CB, 2012
Co-Authors: Shaul Druckmann, Dmitri B. Chklovskii
Abstract: Our brains are capable of remarkably stable stimulus representations despite time-varying neural activity. For instance, during delay periods in working memory tasks, while stimuli are represented in working memory, neurons in the prefrontal cortex thought to support the memory representation exhibit time-varying activity. Since neuronal activity encodes the stimulus, its time-varying dynamics appears paradoxical and incompatible with stable network stimulus representations. Indeed, this finding raises a fundamental question: can stable representations only be encoded with stable neural activity, or, as its corollary, is every change in activity a sign of a change in stimulus representation? Here we explain how the different time-varying representations offered by individual neurons can be woven together to form a coherent, time-invariant representation. Motivated by two ubiquitous features of the neocortex (redundancy of neural representation and sparse intracortical connections), we derive a network architecture that resolves the apparent contradiction between representation stability and changing neural activity. Unexpectedly, this network architecture exhibits many structural properties that have been measured in cortical sensory areas. In particular, we can account for few-neuron motifs, the Synapse Weight distribution, and the relations between neuronal functional properties and connection probability. We show that intuition regarding network stimulus representation, typically derived from considering single neurons, may be misleading, and that time-varying activity of a distributed representation in cortical circuits does not necessarily imply that the network explicitly encodes time-varying properties.
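The core idea, stripped to its simplest redundant case, is that single-neuron activity can vary over time while a fixed linear readout of the population stays constant, because the variation lies entirely in the readout's null space. A two-neuron sketch (the weights and dynamics are illustrative, not the paper's derived architecture):

```python
import numpy as np

w = np.array([1.0, 1.0])       # fixed, redundant readout weights
null_dir = np.array([1.0, -1.0])  # direction with w . null_dir == 0

def activity(t, base=np.array([2.0, 1.0])):
    # Each neuron's rate varies with time, but only along the
    # readout's null space.
    return base + np.sin(t) * null_dir

for t in (0.0, 1.0, 2.0):
    x = activity(t)
    print(x, w @ x)  # x changes with t, but w @ x stays at 3.0
```

Each neuron's rate here is plainly time-varying, yet the represented quantity `w @ x` is time-invariant, which is the resolution of the apparent paradox the abstract describes.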