Cursor Position

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The Experts below are selected from a list of 2493 Experts worldwide ranked by ideXlab platform

Dario Farina - One of the best experts on this subject based on the ideXlab platform.

  • Wearable multichannel haptic device for encoding proprioception in the upper limb
    Journal of Neural Engineering, 2020
    Co-Authors: Patrick Sagastegui Alva, Silvia Muceli, Seyed Farokh Atashzar, Lucie William, Dario Farina
    Abstract:

    Objective. We present the design, implementation, and evaluation of a wearable multichannel haptic system. The device is a wireless closed-loop armband driven by surface electromyography that provides sensory feedback encoding proprioception. The study is motivated by the goal of restoring proprioceptive information in upper-limb prostheses. Approach. The armband comprises eight vibrotactile actuators that generate distributed patterns of mechanical waves around the limb to stimulate perception and to transfer proportional information on arm motion. An experimental study was conducted to assess: the sensory threshold in eight locations around the forearm, user adaptation to the sensation provided by the device, user performance in discriminating multiple stimulation levels, and device performance in coding proprioception using four spatial patterns of stimulation. Eight able-bodied individuals performed reaching tasks by controlling a cursor with an EMG interface in a virtual environment. Vibrotactile patterns were tested with and without visual information on the cursor position, with the addition of a random rotation of the reference control system to disturb natural control and proprioception. Results. The sensation threshold depended on the actuator position and increased over time. The maximum resolution for stimulus discrimination was four. Using this resolution, four patterns of vibrotactile activation with different spatial and magnitude properties were generated to evaluate their performance in enhancing proprioception. The optimal vibration pattern varied among participants. When the feedback was used in closed-loop control with the EMG interface, the task success rate, completion time, execution efficiency, and average target-cursor distance improved for the optimal stimulation pattern compared to the condition without visual or haptic information on the cursor position. Significance. The results indicate that the vibrotactile device enhanced the participants’ perceptual ability, suggesting that the proposed closed-loop system has the potential to code proprioception and enhance user performance in the presence of perceptual perturbation.
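The reported setup (eight actuators around the forearm, a discrimination resolution of four levels) suggests a simple direction-plus-magnitude encoding. The sketch below is purely illustrative: the function name, the angular mapping, and the quantization scheme are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a proprioception-encoding scheme: eight actuators
# arranged around the forearm, four discriminable intensity levels (the
# discrimination resolution reported in the study). Mapping is assumed.
N_ACTUATORS = 8
N_LEVELS = 4  # maximum resolution for stimulus discrimination

def encode_proprioception(angle_deg: float, magnitude: float):
    """Map an arm-motion direction (0-360 deg) and a normalized magnitude
    (0.0-1.0) to an actuator index and a discrete vibration level."""
    # Pick the actuator whose 45-degree sector contains the direction.
    actuator = int(angle_deg % 360 // (360 / N_ACTUATORS))
    # Quantize the magnitude into one of four nonzero levels.
    level = min(N_LEVELS, max(1, round(magnitude * N_LEVELS)))
    return actuator, level
```

For example, a motion at 90 degrees with half-strength magnitude would drive the third actuator at level 2 under this assumed mapping.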

Ziyun Wang - One of the best experts on this subject based on the ideXlab platform.

  • Compensate the Speech Recognition Delays for Accurate Speech-Based Cursor Position Control
    International Conference on Human-Computer Interaction, 2009
    Co-Authors: Qiang Tong, Ziyun Wang
    Abstract:

    In this paper, we describe a back-compensate mechanism to improve the precision of speech-based cursor control. Using this mechanism, the cursor can be moved to small on-screen targets more easily during continuous direction-based navigation, despite the processing delays associated with speech recognition. In contrast, with traditional speech-recognition systems it is difficult to move the cursor precisely to a desired position because of the processing delays introduced by speech recognition. We also describe an experiment in which we evaluated the two alternative solutions: one using traditional speech-based cursor control, and the other using the back-compensate mechanism. We present the encouraging evaluation results at the end of this paper and discuss future work.
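The abstract does not detail the mechanism, but a plausible reading of "back-compensate" is that the cursor keeps a short history of its positions and, when a recognized command such as "stop" finally arrives, rewinds to where the cursor was when the user presumably spoke. The sketch below works under that assumption; the class name, parameters, and the fixed-delay estimate are all hypothetical.

```python
import collections

class DelayCompensatedCursor:
    """Illustrative sketch (not the authors' code): the cursor records
    (timestamp, position) samples; on a recognized 'stop' command it jumps
    back to its position recognition_delay seconds earlier, compensating
    for the speech-recognition processing delay."""

    def __init__(self, recognition_delay=0.5, history_secs=20.0):
        self.delay = recognition_delay      # assumed recognition latency (s)
        self.history_secs = history_secs    # how much history to retain
        self.history = collections.deque()  # (timestamp, position) samples
        self.pos = (0, 0)

    def update(self, pos, now):
        """Record the cursor position at time `now` (seconds)."""
        self.pos = pos
        self.history.append((now, pos))
        # Drop samples older than the retention window.
        while self.history and now - self.history[0][0] > self.history_secs:
            self.history.popleft()

    def on_stop(self, now):
        """Back-compensate: snap to the most recent sample recorded at or
        before (now - delay), i.e., when the user presumably spoke."""
        target = now - self.delay
        for t, p in reversed(self.history):
            if t <= target:
                self.pos = p
                break
        return self.pos
```

With a delay estimate of 5 time units, a cursor that moved one unit per tick and receives "stop" at t=10 would snap back to its t=5 position under this sketch.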

  • HCI (2) - Compensate the Speech Recognition Delays for Accurate Speech-Based Cursor Position Control
    Human-Computer Interaction. Novel Interaction Methods and Techniques, 2009
    Co-Authors: Qiang Tong, Ziyun Wang
    Abstract:

    In this paper, we describe a back-compensate mechanism to improve the precision of speech-based cursor control. Using this mechanism, the cursor can be moved to small on-screen targets more easily during continuous direction-based navigation, despite the processing delays associated with speech recognition. In contrast, with traditional speech-recognition systems it is difficult to move the cursor precisely to a desired position because of the processing delays introduced by speech recognition. We also describe an experiment in which we evaluated the two alternative solutions: one using traditional speech-based cursor control, and the other using the back-compensate mechanism. We present the encouraging evaluation results at the end of this paper and discuss future work.

Patrick Sagastegui Alva - One of the best experts on this subject based on the ideXlab platform.

  • Wearable multichannel haptic device for encoding proprioception in the upper limb
    Journal of Neural Engineering, 2020
    Co-Authors: Patrick Sagastegui Alva, Silvia Muceli, Seyed Farokh Atashzar, Lucie William, Dario Farina
    Abstract:

    Objective. We present the design, implementation, and evaluation of a wearable multichannel haptic system. The device is a wireless closed-loop armband driven by surface electromyography that provides sensory feedback encoding proprioception. The study is motivated by the goal of restoring proprioceptive information in upper-limb prostheses. Approach. The armband comprises eight vibrotactile actuators that generate distributed patterns of mechanical waves around the limb to stimulate perception and to transfer proportional information on arm motion. An experimental study was conducted to assess: the sensory threshold in eight locations around the forearm, user adaptation to the sensation provided by the device, user performance in discriminating multiple stimulation levels, and device performance in coding proprioception using four spatial patterns of stimulation. Eight able-bodied individuals performed reaching tasks by controlling a cursor with an EMG interface in a virtual environment. Vibrotactile patterns were tested with and without visual information on the cursor position, with the addition of a random rotation of the reference control system to disturb natural control and proprioception. Results. The sensation threshold depended on the actuator position and increased over time. The maximum resolution for stimulus discrimination was four. Using this resolution, four patterns of vibrotactile activation with different spatial and magnitude properties were generated to evaluate their performance in enhancing proprioception. The optimal vibration pattern varied among participants. When the feedback was used in closed-loop control with the EMG interface, the task success rate, completion time, execution efficiency, and average target-cursor distance improved for the optimal stimulation pattern compared to the condition without visual or haptic information on the cursor position. Significance. The results indicate that the vibrotactile device enhanced the participants’ perceptual ability, suggesting that the proposed closed-loop system has the potential to code proprioception and enhance user performance in the presence of perceptual perturbation.

Herbert Heuer - One of the best experts on this subject based on the ideXlab platform.

  • Explicit knowledge of sensory non-redundancy can reduce the strength of multisensory integration
    Psychological Research, 2020
    Co-Authors: Nienke B. Debats, Herbert Heuer
    Abstract:

    The brain integrates incoming sensory signals to a degree that depends on the signals’ redundancy. Redundancy—which is commonly high when signals originate from a common physical object or event—is estimated by the brain from the signals’ spatial and/or temporal correspondence. Here we tested whether verbally instructed knowledge of non-redundancy can also be used to reduce the strength of sensory integration. We used a cursor-control task in which cursor motions in the frontoparallel plane were controlled by hand movements in the horizontal plane, yet with a small and randomly varying visuomotor rotation that created spatial discrepancies between hand and cursor positions. Consistent with previous studies, we found mutual biases in the hand and cursor position judgments, indicating partial sensory integration. The integration was reduced in strength, but not eliminated, after participants were verbally informed about the non-redundancy (i.e., the spatial discrepancies) in the hand and cursor positions. Comparisons with model predictions excluded confounding bottom-up effects of the non-redundancy instruction. Our findings thus show that participants have top-down control over the degree to which they integrate sensory information. Additionally, we found that the magnitude of this top-down modulatory capability is a reliable individual trait. A comparison between participants with and without video-gaming experience tentatively suggested a relation between top-down modulation of integration strength and attentional control.
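The "partial sensory integration" and mutual biases described here are commonly formalized in this literature with the reliability-weighted (maximum-likelihood) cue-combination model. The equations below are that standard textbook model, stated in assumed notation; they are not taken from the paper itself:

```latex
\hat{x} = w_h x_h + w_c x_c, \qquad
w_h = \frac{r_h}{r_h + r_c}, \qquad
w_c = \frac{r_c}{r_h + r_c}, \qquad
r_i = \frac{1}{\sigma_i^{2}},
```

where \(x_h, x_c\) are the proprioceptive hand-position and visual cursor-position estimates and \(\sigma_i^2\) their variances. Under this model, each judgment is biased toward the other signal in proportion to that signal's relative reliability, which is what produces the mutual biases the study measures.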

  • Data_Sheet_1_Effects of Hand and Hemispace on Multisensory Integration of Hand Position and Visual Feedback.pdf
    2019
    Co-Authors: Miya K. Rand, Herbert Heuer
    Abstract:

    The brain generally integrates a multitude of sensory signals to form a unified percept. Even in cursor control tasks, such as reaching while looking at rotated visual feedback on a monitor, visual information on cursor position and proprioceptive information on hand position are partially integrated (sensory coupling), resulting in mutual biases of the perceived positions of cursor and hand. Previous studies showed that the strength of sensory coupling (sum of the mutual biases) depends on the experience of kinematic correlations between hand movements and cursor motions, whereas the asymmetry of sensory coupling (difference between the biases) depends on the relative reliabilities (inverse of variability) of hand-position and cursor-position estimates (reliability rule). Furthermore, the precision of movement control and perception of hand position are known to differ between hands (left, right) and workspaces (ipsilateral, contralateral), and so does the experience of kinematic correlations from daily-life activities. Thus, in the present study, we tested whether strength and asymmetry of sensory coupling for the endpoints of reaches in a cursor control task differ between the right and left hand and between ipsilateral and contralateral hemispace. No differences were found in the strength of sensory coupling between hands or between hemispaces. However, asymmetry of sensory coupling was less in ipsilateral than in contralateral hemispace: in ipsilateral hemispace, the bias of the perceived hand position was reduced, which was accompanied by a smaller variability of the estimates. The variability of position estimates of the dominant right hand was also less than for the non-dominant left hand, but this difference was not accompanied by a difference in the asymmetry of sensory coupling – a violation of the reliability rule, probably due to a stronger influence of visual information on right-hand movements. According to these results, the long-term effects of the experienced kinematic correlation between hand movements and cursor motions on the strength of sensory coupling are generic and not specific for hemispaces or hands, whereas the effects of relative reliabilities on the asymmetry of sensory coupling are specific for hemispaces but not for hands.
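The "strength" (sum of the mutual biases) and "asymmetry" (difference between the biases) referred to in this abstract can be written down explicitly. The notation below is assumed for illustration, not quoted from the paper:

```latex
b_h = \hat{x}_h - x_h, \qquad b_c = \hat{x}_c - x_c, \qquad
\text{strength} = |b_h| + |b_c|, \qquad
\text{asymmetry} = |b_h| - |b_c|,
```

where \(b_h\) is the bias of the perceived hand position toward the cursor and \(b_c\) the bias of the perceived cursor position toward the hand. The reliability rule then predicts that the less reliable estimate shows the larger bias, which is the prediction the right-hand results violate.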

  • Effects of Hand and Hemispace on Multisensory Integration of Hand Position and Visual Feedback
    Frontiers Media S.A., 2019
    Co-Authors: Miya K. Rand, Herbert Heuer
    Abstract:

    The brain generally integrates a multitude of sensory signals to form a unified percept. Even in cursor control tasks, such as reaching while looking at rotated visual feedback on a monitor, visual information on cursor position and proprioceptive information on hand position are partially integrated (sensory coupling), resulting in mutual biases of the perceived positions of cursor and hand. Previous studies showed that the strength of sensory coupling (sum of the mutual biases) depends on the experience of kinematic correlations between hand movements and cursor motions, whereas the asymmetry of sensory coupling (difference between the biases) depends on the relative reliabilities (inverse of variability) of hand-position and cursor-position estimates (reliability rule). Furthermore, the precision of movement control and perception of hand position are known to differ between hands (left, right) and workspaces (ipsilateral, contralateral), and so does the experience of kinematic correlations from daily-life activities. Thus, in the present study, we tested whether strength and asymmetry of sensory coupling for the endpoints of reaches in a cursor control task differ between the right and left hand and between ipsilateral and contralateral hemispace. No differences were found in the strength of sensory coupling between hands or between hemispaces. However, asymmetry of sensory coupling was less in ipsilateral than in contralateral hemispace: in ipsilateral hemispace, the bias of the perceived hand position was reduced, which was accompanied by a smaller variability of the estimates. The variability of position estimates of the dominant right hand was also less than for the non-dominant left hand, but this difference was not accompanied by a difference in the asymmetry of sensory coupling – a violation of the reliability rule, probably due to a stronger influence of visual information on right-hand movements. According to these results, the long-term effects of the experienced kinematic correlation between hand movements and cursor motions on the strength of sensory coupling are generic and not specific for hemispaces or hands, whereas the effects of relative reliabilities on the asymmetry of sensory coupling are specific for hemispaces but not for hands.

  • Dissociating explicit and implicit measures of sensed hand Position in tool use: Effect of relative frequency of judging different objects.
    Attention Perception & Psychophysics, 2017
    Co-Authors: Miya K. Rand, Herbert Heuer
    Abstract:

    In a cursor-control task, the sensed positions of cursor and hand are biased toward each other. We previously found different characteristics of implicit and explicit measures of the bias of sensed hand position toward the position of the cursor, suggesting the existence of distinct neural representations. Here we further explored differences between the two types of measure by varying the proportions of trials with explicit hand-position (H) and cursor-position (C) judgments (C20:H80, C50:H50, and C80:H20). In each trial, participants made a reaching movement to a remembered target, with the visual feedback being rotated randomly, and subsequently they judged the hand or the cursor position. Both the explicitly and implicitly measured biases of sensed hand position were stronger with a low proportion (C80:H20) than with a high proportion (C20:H80) of hand-position judgments, suggesting that both measures place more weight on the sensory modality relevant for the more frequent judgment. With balanced proportions of such judgments (C50:H50), the explicitly assessed biases were similar to those observed with a high proportion of cursor-position judgments (C80:H20), whereas the implicitly assessed biases were similar to those observed with a high proportion of hand-position judgments (C20:H80). Because strong weights of cursor-position or hand-position information may be difficult to increase further but are easy to reduce, the findings suggest that the implicit measure of the bias of sensed hand position places a relatively stronger weight on proprioceptive hand-position information, which is increased no further by a high proportion of hand-position judgments. Conversely, the explicit measure places a relatively stronger weight on visual cursor-position information.
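The weighting interpretation in this abstract can be made concrete in standard cue-combination notation (assumed here for illustration, not quoted from the paper): the measured bias of sensed hand position toward the cursor is proportional to the weight given to the visual signal,

```latex
\hat{x}_h = (1 - w_v)\, x_h + w_v\, x_c
\quad\Longrightarrow\quad
b_h = \hat{x}_h - x_h = w_v\,(x_c - x_h),
```

so a smaller measured bias \(b_h\) corresponds to a larger proprioceptive weight \((1 - w_v)\). On this reading, the finding that implicitly assessed biases resemble the high-proportion hand-judgment condition is consistent with the implicit measure carrying a stronger proprioceptive weight that cannot be increased much further.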

Heuer Herbert - One of the best experts on this subject based on the ideXlab platform.

  • Explicit knowledge of sensory non-redundancy can reduce the strength of multisensory integration
    'Springer Science and Business Media LLC', 2020
    Co-Authors: Debats Nienke, Heuer Herbert
    Abstract:

    Debats N, Heuer H. Explicit knowledge of sensory non-redundancy can reduce the strength of multisensory integration. Psychological Research. 2020;84(4):890-906. The brain integrates incoming sensory signals to a degree that depends on the signals' redundancy. Redundancy, which is commonly high when signals originate from a common physical object or event, is estimated by the brain from the signals' spatial and/or temporal correspondence. Here we tested whether verbally instructed knowledge of non-redundancy can also be used to reduce the strength of sensory integration. We used a cursor-control task in which cursor motions in the frontoparallel plane were controlled by hand movements in the horizontal plane, yet with a small and randomly varying visuomotor rotation that created spatial discrepancies between hand and cursor positions. Consistent with previous studies, we found mutual biases in the hand and cursor position judgments, indicating partial sensory integration. The integration was reduced in strength, but not eliminated, after participants were verbally informed about the non-redundancy (i.e., the spatial discrepancies) in the hand and cursor positions. Comparisons with model predictions excluded confounding bottom-up effects of the non-redundancy instruction. Our findings thus show that participants have top-down control over the degree to which they integrate sensory information. Additionally, we found that the magnitude of this top-down modulatory capability is a reliable individual trait. A comparison between participants with and without video-gaming experience tentatively suggested a relation between top-down modulation of integration strength and attentional control.

  • Sensory integration of movements and their visual effects is not enhanced by spatial proximity
    2018
    Co-Authors: Debats, Nienke B., Heuer Herbert
    Abstract:

    Spatial proximity enhances the sensory integration of exafferent position information, likely because it indicates whether the information comes from a single physical source. Does spatial proximity also affect the integration of position information regarding an action (here a hand movement) with that of its visual effect (here a cursor motion), that is, when the sensory information comes from physically distinct objects? In this study, participants made out-and-back hand movements whereby the outward movements were accompanied by corresponding cursor motions on a monitor. Their subsequent judgments of hand or cursor movement endpoints are typically biased toward each other, consistent with an underlying optimal integration mechanism. To study the effect of spatial proximity, we presented the hand and cursor either in orthogonal planes (horizontal and frontal, respectively) or we aligned them in the horizontal plane. We did not find the expected enhanced integration strength in the latter spatial condition. As a secondary question, we asked whether spatial transformations required for the position judgments (i.e., horizontal to frontal or vice versa) could be the origin of previously observed suboptimal variances of the integrated hand and cursor position judgments. We found, however, that the suboptimality persisted when spatial transformations were omitted (i.e., with the hand and cursor in the same plane). Our findings thus clearly show that the integration of actions with their visual effects is, at least for cursor control, independent of spatial proximity.

  • Sensory integration of movements and their visual effects is not enhanced by spatial proximity
    'Association for Research in Vision and Ophthalmology (ARVO)', 2018
    Co-Authors: Debats Nienke, Heuer Herbert
    Abstract:

    Debats N, Heuer H. Sensory integration of movements and their visual effects is not enhanced by spatial proximity. Journal of Vision. 2018;18(11):15. Spatial proximity enhances the sensory integration of exafferent position information, likely because it indicates whether the information comes from a single physical source. Does spatial proximity also affect the integration of position information regarding an action (here a hand movement) with that of its visual effect (here a cursor motion), that is, when the sensory information comes from physically distinct objects? In this study, participants made out-and-back hand movements whereby the outward movements were accompanied by corresponding cursor motions on a monitor. Their subsequent judgments of hand or cursor movement endpoints are typically biased toward each other, consistent with an underlying optimal integration mechanism. To study the effect of spatial proximity, we presented the hand and cursor either in orthogonal planes (horizontal and frontal, respectively) or we aligned them in the horizontal plane. We did not find the expected enhanced integration strength in the latter spatial condition. As a secondary question, we asked whether spatial transformations required for the position judgments (i.e., horizontal to frontal or vice versa) could be the origin of previously observed suboptimal variances of the integrated hand and cursor position judgments. We found, however, that the suboptimality persisted when spatial transformations were omitted (i.e., with the hand and cursor in the same plane). Our findings thus clearly show that the integration of actions with their visual effects is, at least for cursor control, independent of spatial proximity.