Synaptic Noise

The experts below are selected from a list of 360 experts worldwide, ranked by the ideXlab platform.

Alain Destexhe - One of the best experts on this subject based on the ideXlab platform.

  • Noisy dendrites: models of dendritic integration in vivo
    2014
    Co-Authors: Alain Destexhe, Michelle Rudolph-Lilith
    Abstract:

    While dendritic processing has been well characterized in vitro, there are few experimental data and models available about the integrative properties of dendrites in vivo. Here, we review existing computational models to infer the dendritic processing of neocortical pyramidal neurons in vivo. We start by summarizing experimental measurements of the “high-conductance states” of cortical neurons in vivo. Next, we show models predicting that, in such states, the responsiveness of cortical neurons should be greatly enhanced, in particular due to the presence of high-amplitude fluctuations (“synaptic noise”). We infer that in dendrites this effect should be particularly strong, leading to the spontaneous activation of dendritic spikes. The presence of noise in dendrites also enhances spike propagation. We show that opposite distance dependencies of spike initiation and propagation result in roughly location-independent synaptic efficacies. In addition, in high-conductance states, dendrites display sharper temporal processing capabilities. Thus, we conclude that noisy active dendrites behave more “democratically,” and that dendrites should have enhanced processing capabilities in vivo.
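
    To make the responsiveness argument concrete, the sketch below (illustrative only, with assumed parameters rather than values from the reviewed models) drives a leaky integrate-and-fire neuron in a high-conductance state and compares its probability of responding to a weak test input with and without background voltage fluctuations.

```python
# Minimal sketch (assumed parameters, not the authors' code): a leaky
# integrate-and-fire neuron in a "high-conductance" state responds
# probabilistically to a weak input that is subthreshold without noise.
import numpy as np

rng = np.random.default_rng(0)

DT, T_END = 0.1, 120.0          # ms
G_TOT, C_M = 40.0, 200.0        # nS (leak + background), pF -> tau = 5 ms
E_L, V_TH = -70.0, -60.0        # mV
PULSE_PA, PULSE_ON, PULSE_OFF = 400.0, 100.0, 105.0   # weak test input
WINDOW = (100.0, 110.0)         # count spikes only in this response window

def responds(noise_sd):
    """One trial: True if Vm crosses threshold inside the response window."""
    v = E_L
    for k in range(int(T_END / DT)):
        t = k * DT
        i_ext = PULSE_PA if PULSE_ON <= t < PULSE_OFF else 0.0
        dv = (-G_TOT * (v - E_L) + i_ext) / C_M * DT
        v += dv + noise_sd * np.sqrt(DT) * rng.standard_normal()
        if v >= V_TH:
            return WINDOW[0] <= t < WINDOW[1]
    return False

for sd in (0.0, 2.5):           # mV / sqrt(ms): quiescent vs fluctuating background
    p = np.mean([responds(sd) for _ in range(1000)])
    print(f"noise_sd = {sd:.1f}  ->  response probability ~ {p:.2f}")
```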

  • Analyzing synaptic noise
    2012
    Co-Authors: Alain Destexhe, Michelle Rudolph-Lilith
    Abstract:

    As we have shown in the previous chapters, specifically in Chaps. 3 and 5, synaptic noise leads to marked changes in the integrative properties and response behavior of individual neurons. Following the mathematical formulations of synaptic noise (Chap. 7), we derive in the present chapter a new class of stochastic methods for analyzing synaptic noise. These methods treat the membrane potential as a stochastic process. Specific applications of these methods are presented in Chap. 9.
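
    As a minimal illustration of treating the membrane potential as a stochastic process (a sketch with assumed parameters, not the methods derived in the chapter), the snippet below characterizes a synthetic Vm trace by its mean, standard deviation and autocorrelation time, the basic quantities on which such stochastic analyses are built.

```python
# Sketch (assumptions, not the book's method): summarize a fluctuating
# membrane potential as a stochastic process via its first two moments
# and its autocorrelation time.
import numpy as np

rng = np.random.default_rng(1)

# --- synthetic Vm: Ornstein-Uhlenbeck fluctuations around -65 mV ---
dt, n = 0.1, 100_000                 # ms, samples (10 s of "recording")
tau, mu, sigma = 5.0, -65.0, 3.0     # ms, mV, mV
v = np.empty(n)
v[0] = mu
for k in range(1, n):
    v[k] = v[k-1] + (mu - v[k-1]) / tau * dt \
           + sigma * np.sqrt(2 * dt / tau) * rng.standard_normal()

# --- stochastic-process summary statistics ---
v_mean, v_sd = v.mean(), v.std()
x = v - v_mean

max_lag = int(50 / dt)               # look at lags up to 50 ms
acf = np.array([np.mean(x[:n - lag] * x[lag:]) for lag in range(max_lag)]) / x.var()
tau_hat = np.argmax(acf < np.exp(-1.0)) * dt     # 1/e decay lag

print(f"mean Vm  ~ {v_mean:.1f} mV")
print(f"sd of Vm ~ {v_sd:.2f} mV")
print(f"autocorrelation time ~ {tau_hat:.1f} ms (true {tau} ms)")
```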

  • Models of synaptic noise
    2012
    Co-Authors: Alain Destexhe, Michelle Rudolph-Lilith
    Abstract:

    In this chapter, we build models of “synaptic noise” in cortical neurons based on the experimental characterization reviewed in Chap. 3. We first consider detailed models, which incorporate a precise morphological representation of the cortical neuron and its synapses. Next, we review simplified models of “synaptic noise.” Both types of models will be used in the next chapters to investigate the integrative properties of neurons in the presence of synaptic noise.
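
    A widely used simplified model of this kind describes the cell as a single compartment driven by two fluctuating (Ornstein–Uhlenbeck) synaptic conductances, one excitatory and one inhibitory. The sketch below implements that idea with illustrative parameter values; it is an assumption of this summary, not the chapter's code.

```python
# Sketch of a point-conductance-style model (assumed parameters): a single
# compartment receiving two fluctuating synaptic conductances, each modeled
# as an Ornstein-Uhlenbeck process.
import numpy as np

rng = np.random.default_rng(2)

dt, t_end = 0.05, 1000.0                 # ms
steps = int(t_end / dt)

# passive membrane
C_m, g_L, E_L = 200.0, 10.0, -70.0       # pF, nS, mV
E_e, E_i = 0.0, -75.0                    # reversal potentials, mV

# OU conductance parameters: mean, SD, correlation time (illustrative values)
ge0, sig_e, tau_e = 12.0, 3.0, 2.7       # nS, nS, ms
gi0, sig_i, tau_i = 57.0, 6.6, 10.5      # nS, nS, ms

v, ge, gi = E_L, ge0, gi0
v_trace = np.empty(steps)
for k in range(steps):
    # Ornstein-Uhlenbeck updates for the two conductances
    ge += (ge0 - ge) / tau_e * dt + sig_e * np.sqrt(2 * dt / tau_e) * rng.standard_normal()
    gi += (gi0 - gi) / tau_i * dt + sig_i * np.sqrt(2 * dt / tau_i) * rng.standard_normal()
    ge, gi = max(ge, 0.0), max(gi, 0.0)  # conductances cannot be negative
    # current balance: leak + excitatory + inhibitory synaptic currents
    dv = (-g_L * (v - E_L) - ge * (v - E_e) - gi * (v - E_i)) / C_m * dt
    v += dv
    v_trace[k] = v

print(f"mean Vm {v_trace.mean():.1f} mV, SD {v_trace.std():.2f} mV")
```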

  • Recreating synaptic noise using dynamic clamp
    2012
    Co-Authors: Alain Destexhe, Michelle Rudolph-Lilith
    Abstract:

    This chapter will cover one of the most promising and elegant approaches for studying the effect of synaptic noise on neurons: the dynamic-clamp injection of artificial conductance-based synaptic noise. We start with an introduction to the dynamic-clamp technique, and next describe the “re-creation” of in vivo-like activity states in neurons maintained in vitro. We then review the consequences of synaptic noise for the integrative properties of neurons, as found in dynamic-clamp experiments.
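
    In dynamic clamp, the injected current is recomputed at every sampling step from the measured membrane potential and the desired artificial conductances. The sketch below shows the core of such a loop with Ornstein–Uhlenbeck conductance noise; the read_vm and inject_current calls are hypothetical stand-ins for a real acquisition interface, and the parameters are assumed.

```python
# Sketch of a dynamic-clamp loop (hypothetical hardware interface, assumed
# parameters): at each sampling step the injected current is recomputed from
# the measured Vm and two Ornstein-Uhlenbeck synaptic conductances.
import numpy as np

rng = np.random.default_rng(3)

DT = 0.05                                  # ms, sampling period of the clamp
E_E, E_I = 0.0, -75.0                      # mV, reversal potentials
GE0, SIG_E, TAU_E = 12.0, 3.0, 2.7         # nS, nS, ms
GI0, SIG_I, TAU_I = 57.0, 6.6, 10.5        # nS, nS, ms

def read_vm():
    """Hypothetical call to the amplifier: returns the current Vm in mV."""
    return -65.0 + rng.standard_normal()   # stand-in for a real measurement

def inject_current(i_pa):
    """Hypothetical call to the amplifier: command current in pA."""
    pass

ge, gi = GE0, GI0
for _ in range(int(10_000 / DT)):          # ~10 s of closed-loop operation
    v = read_vm()
    # update the artificial conductances (Ornstein-Uhlenbeck processes)
    ge += (GE0 - ge) / TAU_E * DT + SIG_E * np.sqrt(2 * DT / TAU_E) * rng.standard_normal()
    gi += (GI0 - gi) / TAU_I * DT + SIG_I * np.sqrt(2 * DT / TAU_I) * rng.standard_normal()
    ge, gi = max(ge, 0.0), max(gi, 0.0)
    # conductance-based current, recomputed from the *measured* Vm
    i_syn = -ge * (v - E_E) - gi * (v - E_I)   # pA
    inject_current(i_syn)
```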

  • The mathematics of synaptic noise
    2012
    Co-Authors: Alain Destexhe, Michelle Rudolph-Lilith
    Abstract:

    The previous chapters of this book have focused mostly on studies assessing and characterizing synaptic noise under a variety of experimental conditions, and on evaluating its role in shaping neural dynamics through computational models. Although detailed biophysical models of neurons in vivo (see Sect. 4.2) remain, so far, out of reach of a mathematically more rigorous approach, the simplified models introduced earlier (see Sects. 4.3 and 4.4) allow, at least partially, for an analytical treatment. The latter can be used to complement experimental and computational studies and, therefore, to provide a deeper understanding of neuronal dynamics under noisy conditions. Moreover, a mathematical treatment can also provide an unprecedented characterization of synaptic noise and of how it affects spiking activity. This will be the subject of this and the forthcoming chapters.
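
    As a concrete example of the kind of analytical treatment such simplified models admit (the model form below is an assumption made for illustration, not a formula quoted from the chapter), an Ornstein–Uhlenbeck description of a fluctuating synaptic conductance has closed-form stationary statistics, and the mean conductances fix an effective resting potential:

```latex
% Illustrative (assumed) simplified model: Ornstein-Uhlenbeck conductance noise
%   dg/dt = -(g - g_0)/\tau + \sqrt{2\sigma^2/\tau}\,\xi(t), with \xi(t) white noise
\begin{aligned}
  \langle g \rangle_{\infty} &= g_0, &
  \operatorname{Var}[g]_{\infty} &= \sigma^2, &
  \langle \delta g(t)\,\delta g(t+s) \rangle_{\infty} &= \sigma^2 e^{-|s|/\tau}, \\[4pt]
  \bar V &\approx \frac{g_L E_L + g_{e0} E_e + g_{i0} E_i}{g_L + g_{e0} + g_{i0}}
  && \text{(static approximation with mean conductances).}
\end{aligned}
```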

Masatoshi Shiino - One of the best experts on this subject based on the ideXlab platform.

  • Thouless–Anderson–Palmer equation for associative memory neural network with synaptic noise
    Physica E: Low-dimensional Systems & Nanostructures, 2007
    Co-Authors: Akihisa Ichiki, Masatoshi Shiino
    Abstract:

    We study the effects of temporal fluctuations in synaptic couplings on the properties of analog neural networks. Since no energy concept exists in networks with such couplings, the use of the replica method does not make sense. On the other hand, the self-consistent signal-to-noise analysis (SCSNA), which is an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus plays an important role in studying such networks. To apply the SCSNA to stochastic networks, it is necessary to define the deterministic networks equivalent to the original stochastic ones, which are given by the Thouless–Anderson–Palmer (TAP) equations. The TAP equation is therefore of interest for studying the statistical properties of networks with synaptic noise, yet such studies remain scarce. In this paper, we derive the TAP equation, together with a set of order parameter equations, for such networks by using both the cavity method and the SCSNA.
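
    For orientation only, the block below recalls the standard TAP equation for the Ising Hopfield model without synaptic noise, with Hebb couplings, loading α = p/N and inverse temperature β; this is textbook background for the setting the paper generalizes, not the generalized equation derived in it.

```latex
% Standard TAP equation for the Hopfield model WITHOUT synaptic noise
% (background reference; the paper extends this setting to fluctuating couplings).
% Hebb couplings: J_{ij} = \frac{1}{N}\sum_{\mu} \xi_i^{\mu}\xi_j^{\mu}
m_i = \tanh\!\left[\beta\left(\sum_{j\neq i} J_{ij}\, m_j
      \;-\; \frac{\alpha\beta\,(1-q)}{1-\beta\,(1-q)}\, m_i\right)\right],
\qquad q = \frac{1}{N}\sum_{i} m_i^{2}.
```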

  • The Thouless–Anderson–Palmer equation for an analogue neural network with temporally fluctuating white synaptic noise
    Journal of Physics A, 2007
    Co-Authors: Akihisa Ichiki, Masatoshi Shiino
    Abstract:

    Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analogue neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks, in general, defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), which is an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus becomes available for studying networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless–Anderson–Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. Although the TAP equation is closely related to the SCSNA when an energy concept exists, it has received little study in the case without one, which is of particular interest here. This paper aims to derive the TAP equation for networks with synaptic noise, together with a set of order parameter equations, by a hybrid use of the cavity method and the SCSNA.
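
    The sketch below (assumed dynamics and parameters, not the authors' formulation) simulates a small stochastic analogue network of the kind described: Hebbian couplings corrupted by temporally white fluctuations, with the retrieval overlap onto a stored pattern evaluated at the end of the run.

```python
# Sketch (assumed dynamics and parameters): analog associative-memory network
# whose Hebbian couplings fluctuate with temporally white synaptic noise.
import numpy as np

rng = np.random.default_rng(4)

N, P = 400, 8                       # neurons, stored patterns (alpha = P/N = 0.02)
dt, t_end = 0.05, 20.0
noise_amp = 0.3                     # strength of the coupling fluctuations

xi = rng.choice([-1.0, 1.0], size=(P, N))       # random binary patterns
J = xi.T @ xi / N                                # Hebb couplings
np.fill_diagonal(J, 0.0)

x = xi[0] + 0.3 * rng.standard_normal(N)         # start near pattern 0
for _ in range(int(t_end / dt)):
    f = np.tanh(2.0 * x)                         # analog transfer function
    # white fluctuations added to every coupling at each time step
    dJ = noise_amp / np.sqrt(N * dt) * rng.standard_normal((N, N))
    x += (-x + (J + dJ) @ f) * dt

overlap = xi[0] @ np.tanh(2.0 * x) / N           # retrieval order parameter m
print(f"overlap with stored pattern: {overlap:.2f}")
```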

  • Thouless–Anderson–Palmer equation for analog neural network with temporally fluctuating white synaptic noise
    arXiv: Disordered Systems and Neural Networks, 2007
    Co-Authors: Akihisa Ichiki, Masatoshi Shiino
    Abstract:

    Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analog neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks, in general, defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), which is an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus becomes available for studying networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless-Anderson-Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. Although the TAP equation is closely related to the SCSNA when an energy concept exists, it has received little study in the case without one, which is of particular interest here. This paper aims to derive the TAP equation for networks with synaptic noise, together with a set of order parameter equations, by a hybrid use of the cavity method and the SCSNA.

P B Matthews - One of the best experts on this subject based on the ideXlab platform.

  • Properties of human motoneurones and their synaptic noise deduced from motor unit recordings with the aid of computer modelling
    Journal of Physiology-Paris, 1999
    Co-Authors: P B Matthews
    Abstract:

    This paper reviews two new facets of the behaviour of human motoneurones; these were demonstrated by modelling combined with analysis of long periods of low-frequency tonic motor unit firing (sub-primary range). 1) A novel transformation of the interval histogram has shown that the effective part of the membrane's post-spike voltage trajectory is a segment of an exponential (rather than linear), with most spikes being triggered by synaptic noise before the mean potential reaches threshold. The curvature of the motoneurone's trajectory affects virtually all measures of its behaviour and response to stimulation. The ‘trajectory’ is measured from threshold, and so includes any changes in threshold during the interspike interval. 2) A novel rhythmic stimulus (amplitude-modulated pulsed vibration) has been used to show that the motoneurone produces appreciable phase advance during sinusoidal excitation. At low frequencies, the advance increases with rising stimulus frequency but then, slightly below the motoneurone's mean firing rate, it suddenly becomes smaller. The gain has a maximum for stimuli at the mean firing rate (the ‘carrier’). Such behaviour is functionally important since it affects the motoneurone's response to any rhythmic input, whether generated peripherally by the receptors (as in tremor) or by the CNS (as with cortical oscillations). Low mean firing rates favour tremor, since the high gain and reduced phase advance at the ‘carrier’ reduce the stability of the stretch reflex.

  • Relationship of firing intervals of human motor units to the trajectory of post-spike after-hyperpolarization and synaptic noise
    The Journal of Physiology, 1996
    Co-Authors: P B Matthews
    Abstract:

    1. Interspike interval distributions from human motor units of a variety of muscles were analysed to assess the role of synaptic noise in excitation. The time course of the underlying post-spike after-hyperpolarization (AHP) was deduced by applying a specially developed transform to the interval data. Different firing rates were studied both by varying the firing voluntarily, and by selecting subpopulations of spikes for a given firing rate from long recordings with slight variations in frequency. 2. At low firing rates the interval histograms had an exponential tail. Thus at long intervals, the motoneurone was randomly excited by noise and its post-spike AHP was complete. This contrasts with the firing produced by intracellular current injection in the cat, when the membrane potential increases linearly until threshold is reached. The interval histogram was therefore analysed with the aid of a model of synaptic excitation to deduce the mean ‘trajectory’ of membrane voltage in the last part of the interspike interval. 3. The computer model, described in the Appendix, was used to determine the effect of the mean level of membrane potential on the probability of a spike being excited, per unit time, during an on-going interspike interval. All variables were treated as voltages, with synaptic noise simulated by time-smoothed Gaussian noise. This enabled an interval distribution to be transformed into a segment of the underlying trajectory of the membrane potential; the potential was expressed in terms of the noise amplitude and the spike threshold. 4. At low firing rates, the equilibrium value of the membrane voltage trajectory lay well below threshold; the deviation typically corresponded to the standard deviation of the noise or more. The noise standard deviation was estimated to be about 2 mV. 5. With increasing mean firing rate, the near-threshold portion of the trajectory obtainable from the histogram occurred earlier, was steeper and rose to a higher level. Trajectories for different firing rates fell on the same curve after shifting them vertically by varying amounts. The curve was taken to represent the AHP of the motoneurone and was closely exponential. The shift of the trajectory gave its mean synaptic drive. The duration of the AHP varied between units and was longer than average for units from soleus muscle. 6. Further modelling showed that summation of noise with the AHP can explain the well-known changes in discharge variability that occur as firing rate increases. 7. It is concluded that synaptic noise plays a major role in the excitation of tonically firing human motoneurones and that the noiseless motoneurone with a linear trajectory provides an inadequate model for the conscious human. This is of interest in relation to various standard measures of human motor unit activity such as short-term synchronization.
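
    A simplified re-creation of the model described in point 3 (parameters and the noise-smoothing choice are assumptions, not those of the paper's Appendix): an exponential post-spike trajectory plus time-smoothed Gaussian noise generates spikes at threshold crossings, and at low firing rates the resulting interval histogram develops the exponential tail described in point 2.

```python
# Sketch (assumed parameters): interspike intervals from an exponential
# post-spike trajectory plus time-smoothed Gaussian "synaptic noise".
import numpy as np

rng = np.random.default_rng(5)

dt = 0.5                        # ms
tau_ahp = 40.0                  # ms, time constant of the AHP recovery
v_eq = -1.0                     # equilibrium level, in units of noise SD,
                                # sitting just below threshold (threshold = 0)
v_reset = -6.0                  # level immediately after a spike
tau_noise = 4.0                 # ms, smoothing time constant of the noise

def one_interval(max_ms=2000.0):
    """Time to the first threshold crossing after a spike, in ms."""
    noise, t = 0.0, 0.0
    while t < max_ms:
        t += dt
        # mean trajectory: exponential relaxation from reset toward v_eq
        v_mean = v_eq + (v_reset - v_eq) * np.exp(-t / tau_ahp)
        # time-smoothed (Ornstein-Uhlenbeck) Gaussian noise with unit SD
        noise += (-noise / tau_noise) * dt + np.sqrt(2 * dt / tau_noise) * rng.standard_normal()
        if v_mean + noise >= 0.0:       # threshold crossing -> spike
            return t
    return max_ms

intervals = np.array([one_interval() for _ in range(3000)])
counts, edges = np.histogram(intervals, bins=np.arange(0, 400, 10))
print("mean ISI: %.0f ms" % intervals.mean())
print("interval histogram (10 ms bins):", counts[:15])
```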

Akihisa Ichiki - One of the best experts on this subject based on the ideXlab platform.

  • Thouless–Anderson–Palmer equation for associative memory neural network with synaptic noise
    Physica E: Low-dimensional Systems & Nanostructures, 2007
    Co-Authors: Akihisa Ichiki, Masatoshi Shiino
    Abstract:

    We study the effects of temporal fluctuations in synaptic couplings on the properties of analog neural networks. Since no energy concept exists in networks with such couplings, the use of the replica method does not make sense. On the other hand, the self-consistent signal-to-noise analysis (SCSNA), which is an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus plays an important role in studying such networks. To apply the SCSNA to stochastic networks, it is necessary to define the deterministic networks equivalent to the original stochastic ones, which are given by the Thouless–Anderson–Palmer (TAP) equations. The TAP equation is therefore of interest for studying the statistical properties of networks with synaptic noise, yet such studies remain scarce. In this paper, we derive the TAP equation, together with a set of order parameter equations, for such networks by using both the cavity method and the SCSNA.

  • The Thouless–Anderson–Palmer equation for an analogue neural network with temporally fluctuating white synaptic noise
    Journal of Physics A, 2007
    Co-Authors: Akihisa Ichiki, Masatoshi Shiino
    Abstract:

    Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analogue neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks, in general, defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), which is an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus becomes available for studying networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless–Anderson–Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. Although the TAP equation is closely related to the SCSNA when an energy concept exists, it has received little study in the case without one, which is of particular interest here. This paper aims to derive the TAP equation for networks with synaptic noise, together with a set of order parameter equations, by a hybrid use of the cavity method and the SCSNA.

  • Thouless–Anderson–Palmer equation for analog neural network with temporally fluctuating white synaptic noise
    arXiv: Disordered Systems and Neural Networks, 2007
    Co-Authors: Akihisa Ichiki, Masatoshi Shiino
    Abstract:

    Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analog neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks, in general, defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), which is an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus becomes available for studying networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless-Anderson-Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. Although the TAP equation is closely related to the SCSNA when an energy concept exists, it has received little study in the case without one, which is of particular interest here. This paper aims to derive the TAP equation for networks with synaptic noise, together with a set of order parameter equations, by a hybrid use of the cavity method and the SCSNA.

Henri Korn - One of the best experts on this subject based on the ideXlab platform.

  • Synaptic efficacy and the transmission of complex firing patterns between neurons
    Journal of Neurophysiology, 2000
    Co-Authors: Philippe Faure, Daniel Kaplan, Henri Korn
    Abstract:

    In central neurons, the summation of inputs from presynaptic cells combined with the unreliability of synaptic transmission produces incessant variations of the membrane potential termed synaptic noise (SN). These fluctuations, which depend on both the unpredictable timing of afferent activities and quantal variations of postsynaptic potentials, have defied conventional analysis. We show here that, when applied to SN recorded from the Mauthner (M) cell of teleosts, a simple method of nonlinear analysis reveals previously undetected features of this signal including hidden periodic components. The phase relationship between these components is compatible with the notion that the temporal organization of events comprising this noise is deterministic rather than random and that it is generated by presynaptic interneurons behaving as coupled periodic oscillators. Furthermore a model of the presynaptic network shows how SN is shaped both by activities in incoming inputs and by the distribution of their synaptic weights expressed as mean quantal contents of the activated synapses. In confirmation we found experimentally that long-term tetanic potentiation (LTP), which selectively increases some of these synaptic weights, permits oscillating temporal patterns to be transmitted more effectively to the postsynaptic cell. Thus the probabilistic nature of transmitter release, which governs the strength of synapses, may be critical for the transfer of complex timing information within neuronal assemblies.
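
    As a toy version of the presynaptic picture outlined above (the release rule and all parameters are assumptions made for illustration, not the authors' model), the sketch below builds a synaptic-noise trace as the summed IPSPs of a few periodic presynaptic oscillators, each releasing a binomially distributed number of quanta, so that a terminal's mean quantal content determines how strongly its rhythm appears in the postsynaptic signal.

```python
# Sketch (assumed model): synaptic noise as the summed IPSPs of periodic
# presynaptic oscillators with probabilistic (binomial) quantal release.
import numpy as np

rng = np.random.default_rng(6)

dt, t_end = 0.1, 2000.0                  # ms
t = np.arange(0.0, t_end, dt)
tau_ipsp, q_mV = 5.0, -0.15              # IPSP decay and single-quantum size

# each presynaptic interneuron: firing period (ms), release sites n, release prob p
oscillators = [(18.0, 4, 0.3), (23.0, 6, 0.5), (31.0, 8, 0.7)]

vm = np.zeros_like(t)
for period, n_sites, p_rel in oscillators:
    phase = rng.uniform(0.0, period)
    for spike_time in np.arange(phase, t_end, period):
        n_quanta = rng.binomial(n_sites, p_rel)        # quantal content of this IPSP
        if n_quanta == 0:
            continue
        idx = t >= spike_time
        vm[idx] += n_quanta * q_mV * np.exp(-(t[idx] - spike_time) / tau_ipsp)

print(f"synaptic-noise SD: {vm.std():.3f} mV")
# raising p_rel (e.g. after potentiation) increases a terminal's mean quantal
# content n*p, making its periodic component easier to detect in the summed trace.
```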

  • A nonrandom dynamic component in the synaptic noise of a central neuron
    Proceedings of the National Academy of Sciences of the United States of America, 1997
    Co-Authors: Philippe Faure, Henri Korn
    Abstract:

    Continuous segments of synaptic noise were recorded in vivo from teleost Mauthner cells and were studied with the methods of nonlinear analysis. As in many central neurons, this ongoing activity is dominated by consecutive inhibitory postsynaptic potentials. Recurrence plots and first or third order Poincaré maps combined with surrogate shuffling revealed nonrandom patterns consistent with the notion that synaptic noise is a continuously varying mixture of periodic and chaotic phases. Chaos was further demonstrated by the occurrence of unstable periodic orbits. The nonrandom component of the noise is reproducibly and persistently reduced when the level of background sound, a natural stimulus for networks afferent to the Mauthner cell, is briefly elevated. These data are consistent with a model involving a reciprocally connected inhibitory network, presynaptic to the Mauthner cell and its intrinsic properties. The presence of chaos in the inhibitory synaptic noise that regulates the excitability of the Mauthner cell and its sensitivity to external stimuli suggests that it modulates this neuron’s function, namely to trigger a fast escape motor reaction following unexpected sensory information.
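
    For readers unfamiliar with the method, the snippet below computes a basic recurrence matrix from a delay-embedded signal (a generic textbook construction with assumed settings, not the authors' analysis pipeline); periodic stretches produce long diagonal lines, whereas purely random noise gives only scattered recurrences.

```python
# Sketch of a recurrence plot computation (generic method, assumed settings):
# delay-embed a signal and mark pairs of embedded points closer than a radius.
import numpy as np

rng = np.random.default_rng(7)

# toy "synaptic noise": a periodic component buried in random fluctuations
t = np.arange(0, 200, 0.5)
x = np.sin(2 * np.pi * t / 15.0) + 0.4 * rng.standard_normal(t.size)

def recurrence_matrix(signal, dim=3, delay=4, radius=0.5):
    """Boolean recurrence matrix of a delay-embedded scalar signal."""
    n = signal.size - (dim - 1) * delay
    emb = np.column_stack([signal[i * delay : i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return dists < radius

R = recurrence_matrix(x)
print("recurrence rate: %.3f" % R.mean())
# diagonal-line structure (long runs of recurrences parallel to the main
# diagonal) is the signature of deterministic, periodic-like dynamics.
```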

  • Synaptic noise and multiquantal release at dendritic synapses
    Journal of Neurophysiology, 1993
    Co-Authors: Henri Korn, F Bausela, Stephane Charpier, Donald S Faber
    Abstract:

    1. The quantal nature of inhibitory synaptic noise recorded intracellularly from the lateral dendrite of the goldfish Mauthner cell was studied, using new detection and measurement procedures that eliminate operator intervention. In addition, we employed an analytical algorithm, not previously applied to this problem, which treats composite amplitude distributions as mixtures of Gaussians of unknown separations and variances. 2. As in the soma of this neuron, the dendritic inhibitory noise is quantal, with the exception that in the dendrite multiple equally spaced classes may persist in the presence of tetrodotoxin (TTX), an observation that may be correlated with the finding that the inhibitory afferents at this level often contain more than one release site. The validity of the analysis was confirmed by superfusing with saline containing low calcium and high magnesium, which reduces composite histograms that are Gaussian mixtures to a single class, equal in amplitude to that of the first component detected in the control. 3. These results suggest that spontaneous exocytotic events may be synchronized at adjacent active zones within single terminals and that lowering the probability of release by reducing calcium may then be a more effective method for isolating single miniature events than is TTX.
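
    The decomposition described in point 1 can be approximated with an off-the-shelf Gaussian-mixture fit. The sketch below uses synthetic amplitudes and a generic scikit-learn EM fit rather than the authors' specific algorithm; the roughly equal spacing of the fitted class means is the multiquantal signature.

```python
# Sketch (synthetic data, generic EM fit): decompose a composite amplitude
# distribution into a mixture of Gaussian "quantal" classes.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)

# synthetic miniature-event amplitudes: 1, 2 or 3 quanta of ~0.4 mV each,
# with quantal variance and a little recording noise
quanta = rng.choice([1, 2, 3], size=2000, p=[0.5, 0.35, 0.15])
amplitudes = quanta * 0.4 + 0.08 * np.sqrt(quanta) * rng.standard_normal(2000)

gmm = GaussianMixture(n_components=3, random_state=0)
gmm.fit(amplitudes.reshape(-1, 1))

means = np.sort(gmm.means_.ravel())
print("fitted class means (mV):", np.round(means, 2))
print("spacing between classes :", np.round(np.diff(means), 2))
# roughly equal spacing between fitted class means is the signature of
# multiquantal release built from a single underlying quantal size.
```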