Incoming Sensory Information

14,000,000 Leading Edge Experts on the ideXlab platform

The experts below are selected from a list of 327 experts worldwide, ranked by the ideXlab platform.

John T Serences - One of the best experts on this subject based on the ideXlab platform.

  • Coexisting representations of sensory and mnemonic information in human visual cortex.
    Nature Neuroscience, 2019
    Co-Authors: Rosanne L. Rademaker, Chaipat Chunharas, John T Serences
    Abstract:

    Traversing sensory environments requires keeping relevant information in mind while simultaneously processing new inputs. Visual information is kept in working memory via feature-selective responses in early visual cortex, but recent work has suggested that new sensory inputs obligatorily wipe out this information. Here we show region-wide multiplexing abilities in classic sensory areas, with population-level response patterns in early visual cortex representing the contents of working memory alongside new sensory inputs. In a second experiment, we show that distraction disrupts both mnemonic information in early visual cortex and behavioral recall. Representations in the intraparietal sulcus reflect actively remembered information encoded in a transformed format, but not task-irrelevant sensory inputs. Together, these results suggest that early visual areas play a key role in supporting high-resolution working memory representations that can serve as a template for comparison with incoming sensory information.

  • Area Spt in the human planum temporale supports sensory-motor integration for speech processing
    Journal of Neurophysiology, 2009
    Co-Authors: Gregory Hickok, Kayoko Okada, John T Serences
    Abstract:

    Processing incoming sensory information and transforming this input into appropriate motor responses is a critical and ongoing aspect of our moment-to-moment interaction with the environment. While the neural mechanisms in the posterior parietal cortex (PPC) that support the transformation of sensory inputs into simple eye or limb movements have received a great deal of empirical attention, in part because these processes are easy to study in nonhuman primates, little work has been done on sensory-motor transformations in the domain of speech. Here we used functional magnetic resonance imaging and multivariate analysis techniques to demonstrate that a region of the planum temporale (Spt) shows distinct spatial activation patterns during sensory and motor aspects of a speech task. This result suggests that just as the PPC supports sensorimotor integration for eye and limb movements, area Spt forms part of a sensory-motor integration circuit for the vocal tract.

  • Value-based modulations in human visual cortex.
    Neuron, 2008
    Co-Authors: John T Serences
    Abstract:

    Economists and cognitive psychologists have long known that prior rewards bias decision making in favor of options with high expected value. Accordingly, value modulates the activity of sensorimotor neurons involved in initiating movements toward one of two competing decision alternatives. However, little is known about how value influences the acquisition and representation of incoming sensory information, or about the neural mechanisms that track the relative value of each available stimulus to guide behavior. Here, fMRI revealed value-related modulations throughout spatially selective areas of the human visual system in the absence of overt saccadic responses (including in V1). These modulations were primarily associated with the reward history of each stimulus rather than with self-reported estimates of stimulus value. Finally, subregions of frontal and parietal cortex represent the differential value of competing alternatives and may provide signals to bias spatially selective visual areas in favor of more valuable stimuli.
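The multiplexing result in "Coexisting representations of sensory and mnemonic information in human visual cortex" rests on reading out two orientations from a single population response pattern. A minimal sketch of that idea, using a toy inverted encoding model: the voxel weights, channel basis, and orientations below are all invented for illustration and are not the authors' actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox, n_chan = 120, 9
centers = np.linspace(0.0, 160.0, n_chan)        # channel centers in degrees

def basis(theta):
    # raised-cosine orientation channels with 180-degree periodicity
    return np.maximum(np.cos(np.deg2rad(theta - centers) * 2), 0) ** (n_chan - 2)

W = rng.random((n_vox, n_chan))                  # hypothetical voxel weights
                                                 # (in practice, estimated from
                                                 # single-stimulus training data)

def invert(pattern):
    # recover channel responses c from pattern ~= W @ c by least squares
    c, *_ = np.linalg.lstsq(W, pattern, rcond=None)
    return c

mem, sens = 40.0, 120.0                          # remembered vs. new orientation
pattern = W @ (basis(mem) + basis(sens)) + 0.01 * rng.standard_normal(n_vox)
c = invert(pattern)
top_two = set(centers[np.argsort(c)[-2:]])       # strongest two channels
print(sorted(top_two))                           # -> [40.0, 120.0]
```

The reconstructed channel profile peaks at both the remembered and the newly presented orientation, which is the sense in which one pattern "multiplexes" mnemonic and sensory content.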

Hinze Hogendoorn - One of the best experts on this subject based on the ideXlab platform.

  • Predictions drive neural representations of visual events ahead of incoming sensory information.
    Proceedings of the National Academy of Sciences of the United States of America, 2020
    Co-Authors: Tessel Blom, Daniel Feuerriegel, Philippa Johnson, Stefan Bode, Hinze Hogendoorn
    Abstract:

    The transmission of sensory information through the visual system takes time. As a result of these delays, the visual information available to the brain always lags behind the timing of events in the present moment. Compensating for these delays is crucial for functioning within dynamic environments, since interacting with a moving object (e.g., catching a ball) requires real-time localization of the object. One way the brain might achieve this is via prediction of anticipated events. Using time-resolved decoding of electroencephalographic (EEG) data, we demonstrate that the visual system represents the anticipated future position of a moving object, showing that predictive mechanisms activate the same neural representations as afferent sensory input. Importantly, this activation is evident before sensory input corresponding to the stimulus position is able to arrive. Finally, we demonstrate that, when predicted events do not eventuate, sensory information arrives too late to prevent the visual system from representing what was expected but never presented. Taken together, we demonstrate how the visual system can implement predictive mechanisms to preactivate sensory representations, and argue that this might allow it to compensate for its own temporal constraints, allowing us to interact with dynamic visual environments in real time.
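The time-resolved decoding mentioned above amounts to fitting a separate classifier at every sample of the epoch and tracking accuracy over time. A minimal sketch with simulated data: the trial counts, the sample-20 onset, and the nearest-centroid classifier are all stand-ins, not the authors' actual analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_chan, n_time = 100, 32, 50
labels = rng.integers(0, 2, n_trials)        # two stimulus positions
X = rng.standard_normal((n_trials, n_chan, n_time))
effect = rng.standard_normal(n_chan)
# simulated signal: the class difference appears only from sample 20 onward
X[labels == 1, :, 20:] += effect[:, None]

train = np.arange(n_trials) % 2 == 0         # simple even/odd split
test = ~train

acc = np.empty(n_time)
for t in range(n_time):                      # one classifier per timepoint
    mu0 = X[train & (labels == 0), :, t].mean(axis=0)
    mu1 = X[train & (labels == 1), :, t].mean(axis=0)
    d0 = np.linalg.norm(X[test, :, t] - mu0, axis=1)
    d1 = np.linalg.norm(X[test, :, t] - mu1, axis=1)
    acc[t] = ((d1 < d0).astype(int) == labels[test]).mean()

print(acc[:20].mean(), acc[20:].mean())      # chance early, well above chance late
```

In the paper's logic, above-chance decoding of a position *before* the signal onset for that position would be the signature of predictive preactivation; here the simulation only shows the decoding machinery itself.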

Michael D'zmura - One of the best experts on this subject based on the ideXlab platform.

  • Effects of unexpected visual motion on postural sway and motion sickness.
    Applied Ergonomics, 2018
    Co-Authors: Mark Dennison, Michael D'zmura
    Abstract:

    Motion sickness is thought to occur when the brain's assumptions about incoming sensory information do not match the actual signals received. These signals must involve the vestibular system for motion sickness to occur. In this paper, we describe an experiment in which subjects experienced unexpected visual motions, or perturbations, as they navigated a virtual environment (VE) while standing and wearing a head mounted display (HMD) or while viewing a monitor. We found that postural instability, as measured by a balance board, increased with time only when perturbations were present. HMD users exhibited greater sway when exposed to visual perturbations than did monitor users. Yet motion sickness increased only when an HMD was used and occurred with or without participants undergoing perturbations. These results suggest that the postural instability which is generated by unexpected visual perturbation does not necessarily increase the likelihood of motion sickness in a virtual environment.

Mitra J. Z. Hartmann - One of the best experts on this subject based on the ideXlab platform.

  • Using hardware models to quantify sensory data acquisition across the rat vibrissal array.
    Bioinspiration & Biomimetics, 2007
    Co-Authors: V. Gopal, Mitra J. Z. Hartmann
    Abstract:

    Our laboratory investigates how animals acquire sensory data to understand the neural computations that permit complex sensorimotor behaviors. We use the rat whisker system as a model to study active tactile sensing; our aim is to quantitatively describe the spatiotemporal structure of incoming sensory information to place constraints on subsequent neural encoding and processing. In the first part of this paper we describe the steps in the development of a hardware model (a 'sensobot') of the rat whisker array that can perform object feature extraction. We show how this model provides insights into the neurophysiology and behavior of the real animal. In the second part of this paper, we suggest that sensory data acquisition across the whisker array can be quantified using the complete derivative. We use the example of wall-following behavior to illustrate that computing the appropriate spatial gradients across a sensor array would enable an animal or mobile robot to predict the sensory data that will be acquired at the next time step.
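The complete-derivative idea in the abstract above (predicting the next sensory sample from spatial gradients across the array) can be illustrated with a toy one-dimensional sensor array; the field, speed, and time step below are invented for the sketch and are not from the paper.

```python
import numpy as np

def field(u):
    # hypothetical static distance-to-wall profile along the travel direction
    return np.sin(2.0 * np.pi * u)

x = np.linspace(0.0, 1.0, 101)   # sensor positions along the body
v, dt = 0.2, 0.05                # forward speed and time step

s_now = field(x)                 # current array reading (body at origin)
# For a static field sampled by a moving array, ds/dt = v * ds/dx, so the
# complete derivative lets the next reading be predicted from the current
# spatial gradient alone, with no new sensory data needed.
s_pred = s_now + dt * v * np.gradient(s_now, x)
s_next = field(x + v * dt)       # what the array actually reads next

pred_err = np.max(np.abs(s_pred - s_next))
naive_err = np.max(np.abs(s_now - s_next))   # "assume nothing changes" baseline
print(pred_err < naive_err)      # -> True
```

The same one-step prediction extends to a two-dimensional sensor array by replacing the scalar gradient term with the directional derivative v·∇s.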

Wallace B. Mendelson - One of the best experts on this subject based on the ideXlab platform.

  • Neurotransmitters and sleep.
    The Journal of Clinical Psychiatry, 2001
    Co-Authors: Wallace B. Mendelson
    Abstract:

    Sleep is an active process, not just a default state when there is less incoming sensory information. It can be understood best by considering fluctuating levels of a series of neurotransmitters including the biogenic amines and acetylcholine. The effects of these neurotransmitters are not unique to sleep, but also subserve a wide range of other functions, including affect, sexual behavior, and appetite. The most common hypnotics work by binding to the benzodiazepine recognition site of the gamma-aminobutyric acid type A (GABA-A)-benzodiazepine receptor complex, which mediates the action of the most widely distributed inhibitory neurotransmitter in the nervous system. It is possible that some endogenous sleep factors indirectly alter the properties of this receptor complex.