Multisensory Integration


The Experts below are selected from a list of 11,349 Experts worldwide ranked by the ideXlab platform

Mark T Wallace - One of the best experts on this subject based on the ideXlab platform.

  • Selective enhancement of object representations through Multisensory Integration
    bioRxiv, 2019
    Co-Authors: David A Tovar, Micah M Murray, Mark T Wallace
    Abstract:

    Objects are the fundamental building blocks of how we create a representation of the external world. One major distinction among objects is between those that are animate versus inanimate. Many objects are specified by more than a single sense, yet how Multisensory objects are represented by the brain remains poorly understood. Using representational similarity analysis of human EEG signals, we show enhanced encoding of audiovisual objects when compared to their corresponding visual and auditory objects. Surprisingly, we discovered that the often-found processing advantage for animate objects was not evident in a Multisensory context due to greater neural enhancement of inanimate objects, the more weakly encoded objects under unisensory conditions. Further analysis showed that the selective enhancement of inanimate audiovisual objects corresponded with an increase in shared representations across brain areas, suggesting that neural enhancement was mediated by Multisensory Integration. Moreover, a distance-to-bound analysis provided critical links between neural findings and behavior. Improvements in neural decoding at the individual exemplar level for audiovisual inanimate objects predicted reaction time differences between Multisensory and unisensory presentations during a go/no-go animate categorization task. Interestingly, links between neural activity and behavioral measures were most prominent 100 to 200 ms and 350 to 500 ms after stimulus presentation, corresponding to time periods associated with sensory evidence accumulation and decision-making, respectively. Collectively, these findings provide key insights into a fundamental process the brain uses to maximize the information it captures across sensory systems to perform object recognition.

  • Links between temporal acuity and Multisensory Integration across the life span
    Journal of Experimental Psychology: Human Perception and Performance, 2018
    Co-Authors: Ryan A. Stevenson, Sarah H Baum, Juliane Krueger, Paul A Newhouse, Mark T Wallace
    Abstract:

    The temporal relationship between individual pieces of information from the different sensory modalities is one of the stronger cues to integrate such information into a unified perceptual gestalt, conveying numerous perceptual and behavioral advantages. Temporal acuity, however, varies greatly over the life span. It has previously been hypothesized that changes in temporal acuity in both development and healthy aging may thus play a key role in integrative abilities. This study tested the temporal acuity of 138 individuals ranging in age from 5 to 80 years. Temporal acuity and Multisensory Integration abilities were tested both within and across modalities (audition and vision) with simultaneity judgment and temporal order judgment tasks. We observed that temporal acuity, both within and across modalities, improved throughout development into adulthood and subsequently declined with healthy aging, as did the ability to integrate Multisensory speech information. Of importance, throughout development, temporal acuity of simple stimuli (i.e., flashes and beeps) predicted individuals' abilities to integrate more complex speech information. However, in the aging population, although temporal acuity declined with healthy aging and was accompanied by declines in integrative abilities, temporal acuity was not able to predict Integration at the individual level. Together, these results suggest that the impact of temporal acuity on Multisensory Integration varies throughout the life span. Although the maturation of temporal acuity drives the rise of Multisensory integrative abilities during development, it is unable to account for changes in integrative abilities in healthy aging. The differential relationships between age, temporal acuity, and Multisensory Integration suggest an important role for experience in these processes.

  • Identifying and Quantifying Multisensory Integration: A Tutorial Review
    Brain Topography, 2014
    Co-Authors: Ryan A. Stevenson, Juliane Krueger Fister, Nicholas A. Altieri, Aaron R. Nidiffer, Leanne R. Kurela, Justin K. Siemann, Thomas W. James, Dipanwita Ghose, Diana K Sarko, Mark T Wallace
    Abstract:

    We process information from the world through multiple senses, and the brain must decide what information belongs together and what information should be segregated. One challenge in studying such Multisensory Integration is how to quantify the Multisensory interactions, a challenge that is amplified by the host of methods now used to measure neural, behavioral, and perceptual responses. Many of the measures that have been developed to quantify Multisensory Integration (many derived from single-unit analyses) have been applied to these different measures without much consideration for the nature of the process being studied. Here, we provide a review focused on the means by which experimenters quantify Multisensory processes and Integration across a range of commonly used experimental methodologies. We emphasize the most commonly employed measures, including single- and multiunit responses, local field potentials, functional magnetic resonance imaging, and electroencephalography, along with behavioral measures of detection, accuracy, and response times. In each section, we discuss the different metrics commonly used to quantify Multisensory interactions, including the rationale for their use, their advantages, and the drawbacks and caveats associated with them. Also discussed are possible alternatives to the most commonly used metrics.
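One of the most widely used single-neuron metrics covered in this literature expresses the Multisensory response relative to the strongest unisensory response (percent enhancement). A minimal sketch of that computation; the function and variable names here are illustrative, not taken from the paper:

```python
def enhancement_index(multi, uni_a, uni_v):
    """Percent Multisensory enhancement relative to the best unisensory response.

    multi, uni_a, uni_v: mean response magnitudes (e.g., spikes per trial) for
    the multisensory, auditory-alone, and visual-alone conditions.
    Positive values indicate response enhancement; negative values, depression.
    """
    best_uni = max(uni_a, uni_v)
    return 100.0 * (multi - best_uni) / best_uni


# Example: a multisensory response of 15 against unisensory responses of 5 and 10
print(enhancement_index(15.0, 5.0, 10.0))  # 50.0 (percent enhancement)
```

The same index can be computed on behavioral measures (e.g., accuracy), though, as the review stresses, the appropriate metric depends on the nature of the measure being analyzed.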

  • Evidence for diminished Multisensory Integration in autism spectrum disorders
    Journal of Autism and Developmental Disorders, 2014
    Co-Authors: Justin K. Siemann, Ryan A. Stevenson, Tiffany G Woynaroski, Brittany C Schneider, Haley E Eberly, Stephen Camarata, Mark T Wallace
    Abstract:

    Individuals with autism spectrum disorders (ASD) exhibit alterations in sensory processing, including changes in the Integration of information across the different sensory modalities. In the current study, we used the sound-induced flash illusion to assess Multisensory Integration in children with ASD and typically developing (TD) controls. Thirty-one children with ASD and 31 age- and IQ-matched TD children (average age = 12 years) were presented with simple visual (i.e., flash) and auditory (i.e., beep) stimuli of varying number. In illusory conditions, a single flash was presented with 2–4 beeps. In TD children, these conditions generally result in the perception of multiple flashes, implying a perceptual fusion across vision and audition. In the present study, children with ASD were significantly less likely to perceive the illusion relative to TD controls, suggesting that Multisensory Integration and cross-modal binding may be weaker in some children with ASD. These results are discussed in the context of previous findings for Multisensory Integration in ASD and future directions for research.

  • On the use of superadditivity as a metric for characterizing Multisensory Integration in functional neuroimaging studies
    Experimental Brain Research, 2005
    Co-Authors: Paul J. Laurienti, Mark T Wallace, Terrence R. Stanford, Thomas J. Perrault, Barry E. Stein
    Abstract:

    A growing number of brain imaging studies are being undertaken in order to better understand the contributions of Multisensory processes to human behavior and perception. Many of these studies are designed on the basis of the physiological findings from single neurons in animal models, which have shown that Multisensory neurons have the capacity to integrate their different sensory inputs, giving rise to a product that differs significantly from either of the unisensory responses. At certain points these Multisensory interactions can be superadditive, resulting in a neural response that exceeds the sum of the unisensory responses. Because of the difficulties inherent in interpreting the results of imaging large neuronal populations, superadditivity has been put forth as a stringent criterion for identifying potential sites of Multisensory Integration. In the present manuscript we discuss issues related to using the superadditive model in human brain imaging studies, focusing on population responses to Multisensory stimuli and the relationship between single neuron measures and functional brain imaging measures. We suggest that the results of brain imaging studies be interpreted with caution with regard to Multisensory Integration. Future directions for imaging Multisensory Integration are discussed in light of the ideas presented.
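The superadditivity criterion discussed here reduces to a simple contrast: the Multisensory response must exceed the sum of the corresponding unisensory responses. A minimal sketch of that contrast as it might be applied across voxels or units (names and data are illustrative, not from the study):

```python
def superadditivity_contrast(av, a, v):
    """Element-wise AV - (A + V) across voxels/units.

    av, a, v: sequences of response estimates (e.g., BOLD parameter estimates)
    for the audiovisual, auditory-alone, and visual-alone conditions.
    Positive values mark responses exceeding the additive prediction.
    """
    return [m - (x + y) for m, x, y in zip(av, a, v)]


# Example: the first voxel is superadditive (12 > 5 + 4), the second is not
print(superadditivity_contrast([12.0, 8.0], [5.0, 5.0], [4.0, 4.0]))  # [3.0, -1.0]
```

As the abstract cautions, a population-level contrast like this can diverge from what individual neurons are doing, which is why superadditivity in imaging data should be interpreted carefully.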

Barry E. Stein - One of the best experts on this subject based on the ideXlab platform.

  • Development of Multisensory Integration from the perspective of the individual neuron
    Nature Reviews Neuroscience, 2014
    Co-Authors: Barry E. Stein, Terrence R. Stanford, Benjamin A. Rowland
    Abstract:

    The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain’s use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory Integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn’s brain are not capable of Multisensory Integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in such a way that optimizes Multisensory integrative capabilities for the environment in which the animal will function.

  • Initiating the development of Multisensory Integration by manipulating sensory experience
    The Journal of Neuroscience, 2010
    Co-Authors: Benjamin A. Rowland, Barry E. Stein
    Abstract:

    The Multisensory Integration capabilities of superior colliculus neurons emerge gradually during early postnatal life as a consequence of experience with cross-modal stimuli. Without such experience neurons become responsive to multiple sensory modalities but are unable to integrate their inputs. The present study demonstrates that neurons retain sensitivity to cross-modal experience well past the normal developmental period for acquiring Multisensory Integration capabilities. Experience surprisingly late in life was found to rapidly initiate the development of Multisensory Integration, even more rapidly than expected based on its normal developmental time course. Furthermore, the requisite experience was acquired by the anesthetized brain and in the absence of any of the stimulus-response contingencies generally associated with learning. The key experiential factor was repeated exposure to the relevant stimuli, and this required that the multiple receptive fields of a Multisensory neuron encompassed the cross-modal exposure site. Simple exposure to the individual components of a cross-modal stimulus was ineffective in this regard. Furthermore, once a neuron acquired Multisensory Integration capabilities at the exposure site, it generalized this experience to other locations, albeit with lowered effectiveness. These observations suggest that the prolonged period during which Multisensory Integration normally appears is due to developmental factors in neural circuitry in addition to those required for incorporating the statistics of cross-modal events; that neurons learn a Multisensory principle based on the specifics of experience and can then apply it to other stimulus conditions; and that the incorporation of this Multisensory information does not depend on an alert brain.

  • The neural basis of Multisensory Integration in the midbrain: its organization and maturation
    Hearing Research, 2009
    Co-Authors: Barry E. Stein, Terrence R. Stanford, Benjamin A. Rowland
    Abstract:

    Multisensory Integration describes a process by which information from different sensory systems is combined to influence perception, decisions, and overt behavior. Despite a widespread appreciation of its utility in the adult, its developmental antecedents have received relatively little attention. Here we review what is known about the development of Multisensory Integration, with a focus on the circuitry and experiential antecedents of its development in the model system of the Multisensory (i.e., deep) layers of the superior colliculus (SC). Of particular interest here are two sets of experimental observations: (1) cortical influences appear essential for Multisensory Integration in the SC, and (2) postnatal experience guides its maturation. The current belief is that the experience normally gained during early life is instantiated in the cortico-SC projection, and that this is the primary route by which ecological pressures adapt SC Multisensory Integration to the particular environment in which it will be used.

  • Challenges in quantifying Multisensory Integration: alternative criteria, models, and inverse effectiveness
    Experimental Brain Research, 2009
    Co-Authors: Barry E. Stein, Terrence R. Stanford, Ramnarayan Ramachandran, Thomas J. Perrault, Benjamin A. Rowland
    Abstract:

    Single-neuron studies provide a foundation for understanding many facets of Multisensory Integration. These studies have used a variety of criteria for identifying and quantifying Multisensory Integration. While a number of techniques have been used, an explicit discussion of the assumptions, criteria, and analytical methods traditionally used to define the principles of Multisensory Integration is lacking. This was not problematic when the field was small, but with rapid growth a number of alternative techniques and models have been introduced, each with its own criteria and sets of implicit assumptions to define and characterize what is thought to be the same phenomenon. The potential for misconception prompted this reexamination of traditional approaches in order to clarify their underlying assumptions and analytic techniques. The objective here is to review and discuss traditional quantitative methods advanced in the study of single-neuron physiology in order to appreciate the process of Multisensory Integration and its impact.

  • Multisensory Integration: current issues from the perspective of the single neuron
    Nature Reviews Neuroscience, 2008
    Co-Authors: Barry E. Stein, Terrence R. Stanford
    Abstract:

    Multisensory Integration allows information from multiple senses to be combined, with benefits for nervous-system processing. Stein and Stanford discuss the principles of Multisensory Integration in single neurons in the CNS and consider the questions that the field must address.

Aikaterini Fotopoulou - One of the best experts on this subject based on the ideXlab platform.

  • Vestibular modulation of Multisensory Integration during actual and vicarious tactile stimulation
    Psychophysiology, 2019
    Co-Authors: Sonia Ponzo, Louise P Kirsch, Aikaterini Fotopoulou, Paul M Jenkinson
    Abstract:

    The vestibular system has been shown to contribute to Multisensory Integration by balancing conflictual sensory information. It remains unclear whether such modulation of exteroceptive (e.g., vision), proprioceptive, and interoceptive (e.g., affective touch) sensory sources is influenced by epistemically different aspects of tactile stimulation (i.e., felt from within vs. seen, vicarious touch). In the current study, we aimed to (a) replicate previous findings regarding the effects of galvanic stimulation of the right vestibular network in Multisensory Integration, and (b) examine vestibular contributions to Multisensory Integration when touch is felt but not seen (and vice versa). During artificial vestibular stimulation (LGVS, i.e., right vestibular stimulation), RGVS (i.e., bilateral stimulation), and sham (i.e., placebo stimulation), healthy participants (N = 36, Experiment 1; N = 37, Experiment 2) looked at a rubber hand while either their own unseen hand or the rubber hand was touched by affective or neutral touch. We found that (a) LGVS led to enhancement of vision over proprioception during visual-only conditions (replicating our previous findings), and (b) LGVS (versus sham) favored proprioception over vision when touch was felt (Experiment 1), with the opposite results when touch was vicariously perceived via vision (Experiment 2) and with no difference between affective and neutral touch. We showed how vestibular signals modulate the weight of each sensory modality according to the context in which they are perceived and that such modulation extends to different aspects of tactile stimulation: felt and seen touch are differentially balanced in Multisensory Integration according to their epistemic relevance.

  • Vestibular modulation of Multisensory Integration during actual and vicarious tactile stimulation
    bioRxiv, 2019
    Co-Authors: Sonia Ponzo, Louise P Kirsch, Aikaterini Fotopoulou, Paul M Jenkinson
    Abstract:

    Background: The vestibular system has been shown to contribute to Multisensory Integration by balancing conflictual sensory information. It remains unclear whether such modulation of exteroceptive (e.g., vision), proprioceptive, and interoceptive (e.g., affective touch) sensory sources is influenced by epistemically different aspects of tactile stimulation (i.e., felt from within vs. seen, vicarious touch).
    Objective: We aimed to (i) replicate previous findings regarding the effects of galvanic stimulation of the right vestibular network (i.e., LGVS) in Multisensory Integration and (ii) examine vestibular contributions to Multisensory Integration when touch is felt but not seen (and vice versa).
    Method: During artificial vestibular stimulation (LGVS, RGVS, and sham), healthy participants (N = 36, Experiment 1; N = 37, Experiment 2) looked at a rubber hand while either their own unseen hand or the rubber hand was touched by affective or neutral touch.
    Results: We found that (i) LGVS led to enhancement of vision over proprioception during visual-only conditions (replicating our previous findings), and (ii) LGVS (vs. sham) favoured proprioception over vision when touch was felt (Experiment 1), with the opposite results when touch was vicariously perceived via vision (Experiment 2), and with no difference between affective and neutral touch.
    Conclusions: We showed how vestibular signals modulate the weight of each sensory modality according to the context in which they are perceived and that such modulation extends to different aspects of tactile stimulation: felt and seen touch are differentially balanced in Multisensory Integration according to their epistemic relevance.
    Highlights:
      ◦ LGVS increased proprioceptive drift during vision of a rubber hand
      ◦ Touch on the participant’s hand decreased proprioceptive drift during LGVS
      ◦ Vicarious touch on the rubber hand increased proprioceptive drift during LGVS
      ◦ Vestibular signals differentially balance sensory sources in Multisensory Integration

  • Embodied precision: intranasal oxytocin modulates Multisensory Integration
    Journal of Cognitive Neuroscience, 2019
    Co-Authors: Laura Crucianelli, Paul M Jenkinson, Yannis Paloyelis, Lucia Ricciardi, Aikaterini Fotopoulou
    Abstract:

    Multisensory Integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting Multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of Multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting Multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied Multisensory Integration. The present, double-blind, placebo-controlled, crossover study (n = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on Multisensory Integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile Multisensory Integration by increasing the precision of top-down signals against bottom-up sensory input.
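The Bayesian account invoked here treats Multisensory Integration as precision-weighted averaging: each cue contributes in proportion to its reliability (inverse variance), and the fused estimate is more precise than either cue alone. A minimal illustration of that standard computation (names are illustrative; this is not code from the study):

```python
def fuse_estimates(mu_a, var_a, mu_b, var_b):
    """Precision-weighted (maximum-likelihood) fusion of two noisy cues.

    mu_a, var_a: mean and variance of one cue (e.g., vision);
    mu_b, var_b: mean and variance of the other (e.g., touch/proprioception).
    Returns the combined estimate and its reduced variance.
    """
    prec_a, prec_b = 1.0 / var_a, 1.0 / var_b
    mu = (prec_a * mu_a + prec_b * mu_b) / (prec_a + prec_b)
    return mu, 1.0 / (prec_a + prec_b)


# Equally reliable cues are averaged; the fused variance is halved
print(fuse_estimates(0.0, 1.0, 10.0, 1.0))  # (5.0, 0.5)
```

On this view, a neuromodulator that changes the precision of one channel (as IN-OT is hypothesized to do for top-down signals) shifts the weighting, and hence the perceptual outcome, without changing the cues themselves.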

  • Embodied precision: intranasal oxytocin modulates Multisensory Integration
    bioRxiv, 2018
    Co-Authors: Laura Crucianelli, Paul M Jenkinson, Yannis Paloyelis, Lucia Ricciardi, Aikaterini Fotopoulou
    Abstract:

    Multisensory Integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting Multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory bodily ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of Multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting Multisensory input. Recent hypotheses propose that the precision or salience of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied Multisensory Integration. The present, double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on Multisensory Integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased the embodied version of the SWI (quantified as weight estimation error). These findings suggest that oxytocin might modulate processes of visuo-tactile Multisensory Integration by increasing the precision of top-down signals against bottom-up sensory input.

Paul M Jenkinson - One of the best experts on this subject based on the ideXlab platform.

  • Vestibular modulation of Multisensory Integration during actual and vicarious tactile stimulation
    Psychophysiology, 2019
    Co-Authors: Sonia Ponzo, Louise P Kirsch, Aikaterini Fotopoulou, Paul M Jenkinson
    Abstract:

    The vestibular system has been shown to contribute to Multisensory Integration by balancing conflictual sensory information. It remains unclear whether such modulation of exteroceptive (e.g., vision), proprioceptive, and interoceptive (e.g., affective touch) sensory sources is influenced by epistemically different aspects of tactile stimulation (i.e., felt from within vs. seen, vicarious touch). In the current study, we aimed to (a) replicate previous findings regarding the effects of galvanic stimulation of the right vestibular network in Multisensory Integration, and (b) examine vestibular contributions to Multisensory Integration when touch is felt but not seen (and vice versa). During artificial vestibular stimulation (LGVS, i.e., right vestibular stimulation), RGVS (i.e., bilateral stimulation), and sham (i.e., placebo stimulation), healthy participants (N = 36, Experiment 1; N = 37, Experiment 2) looked at a rubber hand while either their own unseen hand or the rubber hand was touched by affective or neutral touch. We found that (a) LGVS led to enhancement of vision over proprioception during visual-only conditions (replicating our previous findings), and (b) LGVS (versus sham) favored proprioception over vision when touch was felt (Experiment 1), with the opposite results when touch was vicariously perceived via vision (Experiment 2) and with no difference between affective and neutral touch. We showed how vestibular signals modulate the weight of each sensory modality according to the context in which they are perceived and that such modulation extends to different aspects of tactile stimulation: felt and seen touch are differentially balanced in Multisensory Integration according to their epistemic relevance.

  • Vestibular modulation of Multisensory Integration during actual and vicarious tactile stimulation
    bioRxiv, 2019
    Co-Authors: Sonia Ponzo, Louise P Kirsch, Aikaterini Fotopoulou, Paul M Jenkinson
    Abstract:

    Background: The vestibular system has been shown to contribute to Multisensory Integration by balancing conflictual sensory information. It remains unclear whether such modulation of exteroceptive (e.g., vision), proprioceptive, and interoceptive (e.g., affective touch) sensory sources is influenced by epistemically different aspects of tactile stimulation (i.e., felt from within vs. seen, vicarious touch).
    Objective: We aimed to (i) replicate previous findings regarding the effects of galvanic stimulation of the right vestibular network (i.e., LGVS) in Multisensory Integration and (ii) examine vestibular contributions to Multisensory Integration when touch is felt but not seen (and vice versa).
    Method: During artificial vestibular stimulation (LGVS, RGVS, and sham), healthy participants (N = 36, Experiment 1; N = 37, Experiment 2) looked at a rubber hand while either their own unseen hand or the rubber hand was touched by affective or neutral touch.
    Results: We found that (i) LGVS led to enhancement of vision over proprioception during visual-only conditions (replicating our previous findings), and (ii) LGVS (vs. sham) favoured proprioception over vision when touch was felt (Experiment 1), with the opposite results when touch was vicariously perceived via vision (Experiment 2), and with no difference between affective and neutral touch.
    Conclusions: We showed how vestibular signals modulate the weight of each sensory modality according to the context in which they are perceived and that such modulation extends to different aspects of tactile stimulation: felt and seen touch are differentially balanced in Multisensory Integration according to their epistemic relevance.
    Highlights:
      ◦ LGVS increased proprioceptive drift during vision of a rubber hand
      ◦ Touch on the participant’s hand decreased proprioceptive drift during LGVS
      ◦ Vicarious touch on the rubber hand increased proprioceptive drift during LGVS
      ◦ Vestibular signals differentially balance sensory sources in Multisensory Integration

  • Embodied precision: intranasal oxytocin modulates Multisensory Integration
    Journal of Cognitive Neuroscience, 2019
    Co-Authors: Laura Crucianelli, Paul M Jenkinson, Yannis Paloyelis, Lucia Ricciardi, Aikaterini Fotopoulou
    Abstract:

    Multisensory Integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting Multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory body ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of Multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting Multisensory input. Recent hypotheses propose that the precision of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine, and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied Multisensory Integration. The present, double-blind, placebo-controlled, crossover study (n = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on Multisensory Integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased an embodied version of the SWI (quantified as estimation error during a weight estimation task). These findings suggest that oxytocin might modulate processes of visuotactile Multisensory Integration by increasing the precision of top-down signals against bottom-up sensory input.

  • Embodied precision: intranasal oxytocin modulates Multisensory Integration
    bioRxiv, 2018
    Co-Authors: Laura Crucianelli, Paul M Jenkinson, Yannis Paloyelis, Lucia Ricciardi, Aikaterini Fotopoulou
    Abstract:

    Multisensory Integration processes are fundamental to our sense of self as embodied beings. Bodily illusions, such as the rubber hand illusion (RHI) and the size-weight illusion (SWI), allow us to investigate how the brain resolves conflicting Multisensory evidence during perceptual inference in relation to different facets of body representation. In the RHI, synchronous tactile stimulation of a participant's hidden hand and a visible rubber hand creates illusory bodily ownership; in the SWI, the perceived size of the body can modulate the estimated weight of external objects. According to Bayesian models, such illusions arise as an attempt to explain the causes of Multisensory perception and may reflect the attenuation of somatosensory precision, which is required to resolve perceptual hypotheses about conflicting Multisensory input. Recent hypotheses propose that the precision or salience of sensorimotor representations is determined by modulators of synaptic gain, like dopamine, acetylcholine and oxytocin. However, these neuromodulatory hypotheses have not been tested in the context of embodied Multisensory Integration. The present, double-blind, placebo-controlled, crossover study (N = 41 healthy volunteers) aimed to investigate the effect of intranasal oxytocin (IN-OT) on Multisensory Integration processes, tested by means of the RHI and the SWI. Results showed that IN-OT enhanced the subjective feeling of ownership in the RHI, only when synchronous tactile stimulation was involved. Furthermore, IN-OT increased the embodied version of the SWI (quantified as weight estimation error). These findings suggest that oxytocin might modulate processes of visuo-tactile Multisensory Integration by increasing the precision of top-down signals against bottom-up sensory input.

Ryan A. Stevenson - One of the best experts on this subject based on the ideXlab platform.

  • Schizotypal personality traits and Multisensory Integration: An investigation using the McGurk effect
    'Elsevier BV', 2021
    Co-Authors: Anne-marie Muller, Tyler C. Dalal, Ryan A. Stevenson
    Abstract:

    Multisensory Integration, the process by which sensory information from different sensory modalities is bound together, is hypothesized to contribute to perceptual symptomatology in schizophrenia, a population in which Multisensory Integration differences have been consistently found. Evidence is emerging that these differences extend across the schizophrenia spectrum, including individuals in the general population with higher levels of schizotypal traits. In the current study, we used the McGurk task as a measure of Multisensory Integration. We measured schizotypal traits using the Schizotypal Personality Questionnaire (SPQ), hypothesizing that higher levels of schizotypal traits, specifically on the Unusual Perceptual Experiences and Odd Speech subscales, would be associated with decreased Multisensory Integration of speech. Surprisingly, Unusual Perceptual Experiences were not associated with Multisensory Integration. However, Odd Speech was associated with Multisensory Integration, and this association extended more broadly across the Disorganized factor of the SPQ, including Odd or Eccentric Behaviour. Individuals with higher Odd or Eccentric Behaviour scores also demonstrated poorer lip-reading abilities, which partially explained performance in the McGurk task. This suggests that aberrant perceptual processes affecting individuals across the schizophrenia spectrum may relate to disorganized symptomatology.

  • links between temporal acuity and Multisensory Integration across life span
    Journal of Experimental Psychology: Human Perception and Performance, 2018
    Co-Authors: Ryan A. Stevenson, Sarah H Baum, Juliane Krueger, Paul A Newhouse, Mark T Wallace
    Abstract:

    The temporal relationship between individual pieces of information from the different sensory modalities is one of the stronger cues to integrate such information into a unified perceptual gestalt, conveying numerous perceptual and behavioral advantages. Temporal acuity, however, varies greatly over the life span. It has previously been hypothesized that changes in temporal acuity in both development and healthy aging may thus play a key role in integrative abilities. This study tested the temporal acuity of 138 individuals ranging in age from 5 to 80. Temporal acuity and Multisensory Integration abilities were tested both within and across modalities (audition and vision) with simultaneity judgment and temporal order judgment tasks. We observed that temporal acuity, both within and across modalities, improved throughout development into adulthood and subsequently declined with healthy aging, as did the ability to integrate Multisensory speech information. Of importance, throughout development, temporal acuity of simple stimuli (i.e., flashes and beeps) predicted individuals' abilities to integrate more complex speech information. However, in the aging population, although temporal acuity declined with healthy aging and was accompanied by declines in integrative abilities, temporal acuity was not able to predict Integration at the individual level. Together, these results suggest that the impact of temporal acuity on Multisensory Integration varies throughout the life span. Although the maturation of temporal acuity drives the rise of Multisensory integrative abilities during development, it is unable to account for changes in integrative abilities in healthy aging. The differential relationships between age, temporal acuity, and Multisensory Integration suggest an important role for experience in these processes.
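
Simultaneity-judgment data of the kind described above are commonly analyzed by fitting a Gaussian to the proportion of "simultaneous" responses across stimulus onset asynchronies (SOAs) and reading a temporal binding window off the fitted curve. A minimal sketch with hypothetical data (not the study's data), where the window is operationalized as the full width at half maximum:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical simultaneity-judgment data: SOAs in ms (audio-leading
# negative) and proportion of "simultaneous" responses at each SOA.
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_simultaneous = np.array([0.05, 0.20, 0.60, 0.85, 0.95, 0.90, 0.70, 0.30, 0.10])

def gaussian(soa, amplitude, center, width):
    """Gaussian response profile commonly fit to SJ data."""
    return amplitude * np.exp(-((soa - center) ** 2) / (2 * width ** 2))

params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
amplitude, center, width = params

# Temporal binding window as the full width at half maximum of the fit;
# the fitted center estimates the point of subjective simultaneity.
fwhm = 2 * np.sqrt(2 * np.log(2)) * abs(width)
```

A wider FWHM indicates coarser temporal acuity; developmental narrowing and age-related widening of this window are the kinds of changes the study relates to integrative ability.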

  • Identifying and Quantifying Multisensory Integration: A Tutorial Review
    Brain Topography, 2014
    Co-Authors: Ryan A. Stevenson, Juliane Krueger Fister, Nicholas A. Altieri, Aaron R. Nidiffer, Leanne R. Kurela, Justin K. Siemann, Thomas W. James, Dipanwita Ghose, Diana K Sarko, Mark T Wallace
    Abstract:

    We process information from the world through multiple senses, and the brain must decide what information belongs together and what information should be segregated. One challenge in studying such Multisensory Integration is how to quantify the Multisensory interactions, a challenge that is amplified by the host of methods that are now used to measure neural, behavioral, and perceptual responses. Many of the measures that have been developed to quantify Multisensory Integration (and which have been derived from single-unit analyses) have been applied to these different measures without much consideration for the nature of the process being studied. Here, we provide a review focused on the means by which experimenters quantify Multisensory processes and Integration across a range of commonly used experimental methodologies. We emphasize the most commonly employed measures, including single- and multiunit responses, local field potentials, functional magnetic resonance imaging, and electroencephalography, along with behavioral measures of detection, accuracy, and response times. In each section, we discuss the different metrics commonly used to quantify Multisensory interactions, including the rationale for their use, their advantages, and the drawbacks and caveats associated with them. Also discussed are possible alternatives to the most commonly used metrics.
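
Two of the classic single-unit-derived metrics this literature discusses are the percent multisensory enhancement index (the combined response relative to the best unisensory response) and the additive (superadditivity) criterion. A minimal sketch with illustrative firing rates, not data from the review:

```python
def multisensory_enhancement(multisensory_response, unisensory_responses):
    """Percent enhancement relative to the best unisensory response:
    ME = 100 * (CM - SMmax) / SMmax, where CM is the combined-modality
    response and SMmax the largest single-modality response."""
    sm_max = max(unisensory_responses)
    return 100.0 * (multisensory_response - sm_max) / sm_max

def exceeds_additive_criterion(multisensory_response, unisensory_responses):
    """Superadditivity check: does the combined response exceed the sum
    of the unisensory responses?"""
    return multisensory_response > sum(unisensory_responses)

# Hypothetical firing rates (spikes/s): visual alone, auditory alone, combined.
visual, auditory, combined = 10.0, 6.0, 22.0
me = multisensory_enhancement(combined, [visual, auditory])        # 120.0
superadditive = exceeds_additive_criterion(combined, [visual, auditory])  # True
```

As the review emphasizes, applying spike-rate metrics like these directly to BOLD, EEG, or response-time data requires care, since the baseline and summation assumptions differ across measures.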

  • evidence for diminished Multisensory Integration in autism spectrum disorders
    Journal of Autism and Developmental Disorders, 2014
    Co-Authors: Justin K. Siemann, Ryan A. Stevenson, Tiffany G Woynaroski, Brittany C Schneider, Haley E Eberly, Stephen Camarata, Mark T Wallace
    Abstract:

    Individuals with autism spectrum disorders (ASD) exhibit alterations in sensory processing, including changes in the Integration of information across the different sensory modalities. In the current study, we used the sound-induced flash illusion to assess Multisensory Integration in children with ASD and typically developing (TD) controls. Thirty-one children with ASD and 31 age- and IQ-matched TD children (average age = 12 years) were presented with simple visual (i.e., flash) and auditory (i.e., beep) stimuli of varying number. In illusory conditions, a single flash was presented with 2–4 beeps. In TD children, these conditions generally result in the perception of multiple flashes, implying a perceptual fusion across vision and audition. In the present study, children with ASD were significantly less likely to perceive the illusion relative to TD controls, suggesting that Multisensory Integration and cross-modal binding may be weaker in some children with ASD. These results are discussed in the context of previous findings for Multisensory Integration in ASD and future directions for research.