Synaptic Scaling

The Experts below are selected from a list of 2,046 Experts worldwide, ranked by the ideXlab platform.

Christian Tetzlaff - One of the best experts on this subject based on the ideXlab platform.

  • The Use of Hebbian Cell Assemblies for Nonlinear Computation.
    Scientific Reports, 2015
    Co-Authors: Christian Tetzlaff, Sakyasingha Dasgupta, Tomas Kulvicius, Florentin Worgotter
    Abstract:

    When learning a complex task, our nervous system self-organizes large groups of neurons into coherent dynamic activity patterns. This creates a network with multiple, simultaneously active, and computationally powerful cell assemblies. How such ordered structures form while preserving the rich diversity of neural dynamics needed for computation is still unknown. Here we show that combining Synaptic plasticity with the slower process of Synaptic Scaling achieves (i) the formation of cell assemblies and (ii) an enhanced diversity of neural dynamics that facilitates the learning of complex computations. Due to Synaptic Scaling, the dynamics of different cell assemblies do not interfere with each other. As a consequence, this type of self-organization allows a robot to execute a difficult six-degrees-of-freedom manipulation task, in which assemblies must learn to compute complex non-linear transforms and, for execution, must cooperate with each other without interference. This mechanism thus permits the self-organization of computationally powerful sub-structures in dynamic networks for behavior control.

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    PLOS Computational Biology, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter
    Abstract:

    Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term Synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for long-term storage, which are consolidated, from synapses of short-term storage, which fade. How time-scale integration and Synaptic differentiation are achieved simultaneously remains unclear. Here we show that Synaptic Scaling – a slow process usually associated with the maintenance of activity homeostasis – combined with Synaptic plasticity may achieve both at once, thereby providing a natural separation of short- from long-term storage. The interaction between plasticity and Scaling also explains an established paradox whereby memory consolidation critically depends on the exact order of learning and recall. These results indicate that Scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory formation processes.

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    BMC Neuroscience, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter
    Abstract:

    Memory formation in the nervous system relies on mechanisms acting on time scales from minutes, for long-term Synaptic plasticity [1], to days, for memory consolidation [2]. During such processes, the neural network distinguishes synapses relevant for forming long-term storage (LTS), which are consolidated, from synapses of short-term storage (STS), which fade. How time-scale integration and Synaptic differentiation are achieved simultaneously within one neural circuit remains unclear. We show in simulations and mean-field analyses that Synaptic Scaling [3] - a slow process usually associated with the maintenance of activity homeostasis - combined with the faster processes of Synaptic plasticity achieves both at once, thereby providing a natural separation of short- from long-term storage. A network-intrinsic bifurcation enables this separation by inducing different response properties in previously learned cell assemblies upon external memory reactivations. These reactivations could be associated with "sleep-like" activations such as sharp-wave ripples during slow-wave sleep [4,5]. Additionally, the interaction between plasticity and Scaling explains an established paradox whereby memory consolidation and destabilization critically depend on the exact order of learning and recall. This enables us to reproduce human psychophysical results [6] on the apparently paradoxical effect of memory destabilization due to memory recall [7]. Other experimenters, however, failed to reproduce this destabilization effect (e.g., [8]). The proposed bifurcation scenario explains this ambivalence, as the initial conditions and the exact timing of recall and learning determine the transition between consolidation and destabilization. Thus, in our model's dynamics, memory - as in real systems - remains susceptible to perturbations and has to be repeatedly consolidated [2], which could happen during sleep [4,5]. To achieve final stabilization of memory, systems consolidation, which also begins during sleep [4], performs a transition from a dynamic to a more static memory representation by transferring the information to the neocortex [2]. The processes suggested here are capable of repeatedly (re)consolidating LTS synapses while STS candidates fade, and may thus contribute essentially to providing a stable substrate for systems consolidation.

  • Analysis of Synaptic Scaling in combination with Hebbian plasticity in several simple networks
    Frontiers in Computational Neuroscience, 2012
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter
    Abstract:

    Conventional Synaptic plasticity in combination with Synaptic Scaling is a biologically plausible plasticity rule that guides the development of synapses towards stability. Here we analyze the development of Synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and Scaling. We show under which constraints an external input to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949), by enhancing Synaptic weights to larger stable values than in the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity: embedding recurrent structures (excitatory rings, etc.) into a feed-forward network extends the input trace into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with Scaling, and make it possible to use this rule for constructing artificial networks with desired storage properties. A minimal simulation sketch of this input-trace effect follows this publication list.

  • Synaptic Scaling in combination with many generic plasticity mechanisms stabilizes circuit connectivity
    Frontiers in Computational Neuroscience, 2011
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter
    Abstract:

    Synaptic Scaling is a slow process that modifies synapses, keeping the firing rate of neural circuits in specific regimes. Together with other processes, such as conventional Synaptic plasticity in the form of long-term depression and potentiation, this changes the Synaptic patterns in a network, ensuring diverse, functionally relevant, stable, and input-dependent connectivity. How Synaptic patterns are generated and stabilized, however, is largely unknown. Here we formally describe and analyze Synaptic Scaling based on results from experimental studies and demonstrate that the combination of different conventional plasticity mechanisms with Synaptic Scaling provides a powerful general framework for regulating network connectivity. In addition, we design several simple models that reproduce experimentally observed Synaptic distributions as well as the Synaptic modifications observed during sustained activity changes. These models predict that the combination of plasticity with Scaling generates globally stable, input-controlled Synaptic patterns, also in recurrent networks. Thus, in combination with other forms of plasticity, Synaptic Scaling can robustly yield neuronal circuits with high Synaptic diversity, potentially allowing for more robust dynamic storage of complex activation patterns. This mechanism is even more pronounced in networks with a realistic degree of inhibition, and could be the basis for learning structured behavior even in initially random networks.
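
The input-trace effect described in the two Frontiers entries above can be made concrete with a small simulation. The sketch below is ours, not the papers' code: it assumes a chain of saturating rate neurons with one feed-forward weight per layer, trained by plain Hebbian growth plus a nonlinear weight-dependent Scaling term, dw/dt = mu*pre*post + gamma*(nu_T - post)*w^2; the helper run_chain and all parameter values are illustrative inventions.

    import numpy as np

    # Sketch (ours, illustrative): an "input trace" forming in a feed-forward
    # chain. Each layer k has one neuron and one feed-forward weight w[k];
    # learning combines Hebbian growth with nonlinear weight-dependent
    # synaptic scaling, dw/dt = mu*pre*post + gamma*(nu_T - post)*w**2.

    def run_chain(v_in, layers=8, mu=1e-2, gamma=1e-4, nu_T=0.5,
                  v_max=1.0, dt=0.1, steps=50_000):
        w = np.full(layers, 0.05)                    # small initial weights
        v = np.zeros(layers)
        for _ in range(steps):
            pre = v_in
            for k in range(layers):                  # forward pass
                v[k] = min(w[k] * pre, v_max)        # saturating rate neuron
                pre = v[k]
            pres = np.concatenate(([v_in], v[:-1]))  # presynaptic rate per layer
            dw = mu * pres * v + gamma * (nu_T - v) * w**2
            w = np.clip(w + dt * dw, 0.0, None)
        return v

    # A strong input potentiates weights layer by layer, so its trace spans
    # the whole chain; a weak input's trace never forms in the same period.
    print("strong input:", np.round(run_chain(1.0), 3))
    print("weak input  :", np.round(run_chain(0.02), 3))

Choosing v_max above nu_T matters here: once a neuron saturates, the negative Scaling term can balance the bounded Hebbian term, so potentiated weights settle at stable values instead of growing without bound.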

Giacomo Indiveri - One of the best experts on this subject based on the ideXlab platform.

  • An Ultralow Leakage Synaptic Scaling Homeostatic Plasticity Circuit With Configurable Time Scales up to 100 ks
    IEEE Transactions on Biomedical Circuits and Systems, 2017
    Co-Authors: Ning Qiao, Chiara Bartolozzi, Giacomo Indiveri
    Abstract:

    Homeostatic plasticity is a stabilizing mechanism commonly observed in real neural systems that allows neurons to maintain their activity around a functional operating point. This phenomenon can be used in neuromorphic systems to compensate for slowly changing conditions or chronic shifts in the system configuration. However, to avoid interference with other adaptation or learning processes active in the neuromorphic system, it is important that the homeostatic plasticity mechanism operates on time scales much longer than those of conventional Synaptic plasticity. In this paper we present an ultralow-leakage circuit, integrated into an automatic gain control scheme, that can implement the Synaptic Scaling homeostatic process over extremely long time scales. Synaptic Scaling consists of globally Scaling the Synaptic weights of all synapses impinging onto a neuron while maintaining their relative differences, to preserve the effects of learning. The scheme we propose controls the global gain of analog log-domain synapse circuits to keep the neuron's average firing rate constant around a set operating point over extremely long time scales. To validate the proposed scheme, we implemented the ultralow-leakage Synaptic Scaling homeostatic plasticity circuit in a standard 0.18 μm complementary metal-oxide-semiconductor process and integrated it into an array of dynamic synapses connected to an adaptive integrate-and-fire neuron. The circuit occupies a silicon area of 84 μm × 22 μm and consumes approximately 10.8 nW with a 1.8 V supply voltage. We present experimental results from the homeostatic circuit and demonstrate how it can be configured to exhibit time scales of up to 100 ks, thanks to a controllable leakage current that can be scaled down to 0.45 aA (2.8 electrons per second).

  • Ultra low leakage Synaptic Scaling circuits for implementing homeostatic plasticity in neuromorphic architectures
    2014 IEEE International Symposium on Circuits and Systems (ISCAS), 2014
    Co-Authors: Giovanni Rovere, Qiao Ning, Chiara Bartolozzi, Giacomo Indiveri
    Abstract:

    Homeostatic plasticity is a property of biological neural circuits that stabilizes their neuronal firing rates in the face of input changes or environmental variations. Synaptic Scaling is a particular homeostatic mechanism that acts at the level of the single neuron over long time scales, changing the gain of all its afferent synapses to keep the neuron's mean firing rate within proper operating bounds. In this paper we present ultra-low-leakage analog circuits that enable compact integrated filters in multi-neuron chips, able to achieve time constants on the order of hundreds of seconds, and describe automatic gain control circuits that, when interfaced to neuromorphic neuron and synapse circuits, implement faithful models of biologically realistic Synaptic Scaling mechanisms. We present simulation results of the low-leakage circuits and describe the control circuits that have been designed for a neuromorphic multi-neuron chip, fabricated using a standard 180 nm CMOS process.
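
Both circuit papers above implement the same control idea: a very slow low-pass filter estimates the neuron's mean firing rate, and an automatic gain control loop adjusts one global gain on all afferent synapses until that estimate sits at the set point, leaving relative weights (and thus learning) untouched. The following discrete-time sketch is a software analogy of that loop under assumed dynamics, not the published circuit; the toy linear rate model, loop gain eta, and seconds-scale time constants are illustrative (the hardware reaches time constants up to 100 ks).

    import numpy as np

    # Sketch (ours, illustrative): synaptic-scaling homeostasis as automatic
    # gain control. A slow low-pass filter estimates the mean firing rate and
    # a single global gain on all afferent synapses is nudged toward a target
    # rate. Relative weights never change, so learned differences survive.

    rng = np.random.default_rng(1)

    dt       = 0.01                        # s, simulation step
    tau_est  = 50.0                        # s, slow rate-estimation filter
    eta      = 0.01                        # gain-adaptation rate (AGC loop)
    target   = 50.0                        # Hz, desired mean firing rate
    w        = rng.uniform(0.2, 1.0, 64)   # fixed relative synaptic weights
    gain     = 1.0                         # global synaptic gain (adapted)
    rate_est = target                      # running firing-rate estimate

    for step in range(int(1000 / dt)):     # ~17 simulated minutes
        drive = 100.0 if step * dt > 500 else 40.0       # input steps at t=500 s
        rate = gain * w.mean() * drive                   # toy linear rate model, Hz
        rate_est += dt / tau_est * (rate - rate_est)     # slow low-pass estimate
        gain += dt * eta * (target - rate_est) / target  # AGC correction

    print(f"gain = {gain:.3f}, rate estimate = {rate_est:.1f} Hz "
          f"(target {target} Hz)")

Because the loop only multiplies every weight by the same gain, the ratio between any two synapses is exactly preserved, which is what distinguishes Synaptic Scaling from per-synapse homeostatic rules.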

Florentin Worgotter - One of the best experts on this subject based on the ideXlab platform.

  • The Use of Hebbian Cell Assemblies for Nonlinear Computation.
    Scientific Reports, 2015
    Co-Authors: Christian Tetzlaff, Sakyasingha Dasgupta, Tomas Kulvicius, Florentin Worgotter

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    PLOS Computational Biology, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    BMC Neuroscience, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter

  • Analysis of Synaptic Scaling in combination with Hebbian plasticity in several simple networks
    Frontiers in Computational Neuroscience, 2012
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter

  • Synaptic Scaling in combination with many generic plasticity mechanisms stabilizes circuit connectivity
    Frontiers in Computational Neuroscience, 2011
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter

Christoph Kolodziejski - One of the best experts on this subject based on the ideXlab platform.

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    PLOS Computational Biology, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    BMC Neuroscience, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter

  • Analysis of Synaptic Scaling in combination with Hebbian plasticity in several simple networks
    Frontiers in Computational Neuroscience, 2012
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter

  • Synaptic Scaling in combination with many generic plasticity mechanisms stabilizes circuit connectivity
    Frontiers in Computational Neuroscience, 2011
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter

  • Synaptic Scaling generically stabilizes circuit connectivity
    BMC Neuroscience, 2011
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter
    Abstract:

    Neural systems regulate Synaptic plasticity to avoid overly strong growth or shrinkage of connections, thereby keeping the circuit architecture operational. Accordingly, several experimental studies have shown that Synaptic weights increase only in direct relation to their current value, resulting in reduced growth for stronger synapses [1]. It is, however, difficult to extract from these studies unequivocal evidence about the underlying biophysical mechanisms that control weight growth. The theoretical neurosciences have addressed this problem by exploring mechanisms for Synaptic weight change that contain limiting factors to regulate growth [2]. The effectiveness of these mechanisms is difficult to justify from a biophysical perspective, in particular for those that require knowledge of global network status (e.g., the ‘sum of all weights’) for normalization. Spike-timing-dependent plasticity [3] also cannot guarantee stability, because various types of plasticity exist across different neurons and even at the same neuron, depending on the location of the synapses [1]. Therefore, it remains an open question how neural circuits simultaneously stabilize their many synapses and ensure diversity in the presence of a variety of distinct plasticity mechanisms. In 1998, a series of studies initiated by Turrigiano augmented this discussion by demonstrating that network activity is homeostatically regulated, suggesting that weights ω are regulated by an activity-dependent difference term [4,5]. Accordingly, Synaptic Scaling compares the output activity ν of each individual neuron against a desired target activity ν_T [5]. Most straightforwardly, such a local weight change is defined by dω/dt = γ H(ν_T − ν) [6], where the long characteristic time scale (hours up to days) of Synaptic Scaling is determined by a small factor γ << 1. Synaptic Scaling operates in parallel to conventional plasticity and acts simultaneously on different synapses. Here we suggest that Synaptic Scaling combines with different types of plasticity mechanisms in the same circuit, or even at the same neuron, and regulates Synaptic diversity across the circuit. We demonstrate that it robustly yields stable and diverse weight distributions which, moreover, are independent of the individual plasticity mechanism. As Scaling co-acts with plasticity, such a combined mechanism is mathematically characterized by a weight change dω/dt = μG + γH, where μ defines the rate of change of conventional Synaptic plasticity, γ << μ << 1, and G and H describe the specific types of plasticity and Scaling, respectively [7]. For example, G differs between plain Hebbian plasticity and STDP. As we show, combining any type of conventional plasticity G with nonlinear weight-dependent Scaling H naturally yields global Synaptic stabilization across the circuit, regardless of the specific form of the plasticity G and largely independent of the intrinsic neuron dynamics. Our study demonstrates that synapses are stabilized strictly in an input-determined way, thereby capturing characteristic features of the inputs to the network. As an important result, we show that such systems can represent a given input pattern via stably changed weights along several stages of signal propagation. This holds even in circuits containing a substantial number of random recurrent connections but no particular additional architecture.
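
The combined rule quoted above, dω/dt = μG + γH, can be probed numerically. The sketch below is our illustration, with concrete choices for the terms the abstract leaves generic: plain Hebbian plasticity G = u·ν and a nonlinear weight-dependent Scaling term H = (ν_T − ν)·ω²; the parameter values are invented for the demo.

    import numpy as np

    # Sketch (ours, illustrative) of dw/dt = mu*G + gamma*H for one rate
    # neuron with many synapses. Assumed forms: Hebbian G = u*v and the
    # nonlinear weight-dependent scaling H = (nu_T - v) * w**2.

    rng = np.random.default_rng(0)
    mu, gamma = 1e-2, 1e-4            # gamma << mu << 1, as in the abstract
    nu_T, dt  = 1.0, 0.1              # target rate and integration step

    u = rng.uniform(0.5, 1.5, 20)     # fixed presynaptic rates (20 synapses)
    w = rng.uniform(0.01, 0.10, 20)   # initial weights

    for _ in range(50_000):
        v = max(w @ u, 0.0)           # postsynaptic rate (linear, rectified)
        w += dt * (mu * u * v + gamma * (nu_T - v) * w**2)
        w = np.clip(w, 0.0, None)

    # Pure Hebbian growth alone would diverge; the scaling term stabilizes
    # each weight at an input-determined value, roughly sqrt(mu*u_i/gamma).
    print("output rate:", round(float(w @ u), 1))
    print("weights    :", np.round(np.sort(w), 2))

The resulting fixed point is input-determined rather than strictly homeostatic: the Hebbian term drives the output rate above ν_T, and each weight settles where the two terms balance, consistent with the abstract's claim that synapses are stabilized in an input-determined way.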

Marc Timme - One of the best experts on this subject based on the ideXlab platform.

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    PLOS Computational Biology, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter

  • Synaptic Scaling enables dynamically distinct short and long term memory formation
    BMC Neuroscience, 2013
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Misha Tsodyks, Florentin Worgotter

  • Analysis of Synaptic Scaling in combination with Hebbian plasticity in several simple networks
    Frontiers in Computational Neuroscience, 2012
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter

  • Synaptic Scaling in combination with many generic plasticity mechanisms stabilizes circuit connectivity
    Frontiers in Computational Neuroscience, 2011
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter

  • Synaptic Scaling generically stabilizes circuit connectivity
    BMC Neuroscience, 2011
    Co-Authors: Christian Tetzlaff, Christoph Kolodziejski, Marc Timme, Florentin Worgotter