Tactile Perception

14,000,000 Leading Edge Experts on the ideXlab platform


The Experts below are selected from a list of 15,711 Experts worldwide, ranked by the ideXlab platform.

Xinyu Chai - One of the best experts on this subject based on the ideXlab platform.

  • Estimation of simulated phosphene size based on Tactile Perception.
    Artificial organs, 2011
    Co-Authors: Panpan Chen, Ying Zhao, Jingru Shi, Qiushi Ren, Xinyu Chai
    Abstract:

    Clinical trials have successfully shown that a visual prosthesis can elicit visual Perception (phosphenes) in the visual field. Psychophysical studies based on simulated prosthetic vision offer an effective means to evaluate and refine prosthetic vision. We designed three experiments to examine the effect of phosphene luminance, flicker rate, and eccentricity on the ability to estimate simulated phosphene sizes using Tactile Perception. Thirty subjects participated in the three experiments. There was a linear increase in reported size as visual stimulus size increased. Judgment was significantly affected by stimulus luminance and eccentricity (P < 0.05) but not by flicker rates. Brighter stimuli were perceived as being larger, and the more eccentric the position, the larger the estimated size. These simulation studies, although idealized, suggested that Tactile Perception is a potential way to estimate phosphene sizes.
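As a rough illustration of the analysis this abstract describes (a linear relation between stimulus size and tactilely reported size, with a luminance bias), the following sketch fits a line to synthetic data. All numbers here are hypothetical, not the authors' data; the luminance boost is an assumed constant bias.

```python
# Illustrative sketch (synthetic data, not the study's): least-squares fit of
# tactilely reported size against visual stimulus size.
import random

random.seed(0)

# Hypothetical stimulus sizes, 30 simulated subjects per size.
sizes = [s for s in (1.0, 2.0, 3.0, 4.0) for _ in range(30)]
luminance_boost = 0.3  # assumed constant bias: brighter stimuli judged larger
reports = [1.1 * s + luminance_boost + random.gauss(0, 0.1) for s in sizes]

# Ordinary least-squares slope and intercept.
n = len(sizes)
mx = sum(sizes) / n
my = sum(reports) / n
sxx = sum((x - mx) ** 2 for x in sizes)
sxy = sum((x - mx) * (y - my) for x, y in zip(sizes, reports))
slope = sxy / sxx
intercept = my - slope * mx

print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

The recovered slope near 1 mirrors the reported linear increase of estimated size with stimulus size; the positive intercept stands in for the luminance-driven overestimation.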

  • Study of Simulated Phosphene Size Based on Tactile Perception with Three Distributions
    2009 3rd International Conference on Bioinformatics and Biomedical Engineering, 2009
    Co-Authors: Shu Ling, Panpan Chen, Ying Zhao, Qiushi Ren, Cong Dai, Jin Fan, Xinyu Chai
    Abstract:

    This paper presents a method for estimating the size of simulated phosphenes based on Tactile Perception. To evaluate size matching, subjects are placed in a simulated prosthetic vision environment and use a size-matching Tactile board to match the sizes of the phosphenes they observe. Three spatial distributions of the simulated phosphenes (Uniform, Gaussian I, and Gaussian II) are studied, and the experimental results reveal differences among the three distributions.
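A minimal sketch of what the three phosphene layouts might look like, assuming "Uniform" and "Gaussian I/II" refer to the spatial distribution of phosphene positions with two different Gaussian spreads (the paper's actual parameters are not given here):

```python
# Hypothetical phosphene position layouts: one uniform and two Gaussian
# distributions with different spreads. Parameters are illustrative only.
import math
import random

random.seed(1)
n = 500  # assumed number of simulated phosphenes

uniform = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]
gauss1 = [(random.gauss(0, 0.25), random.gauss(0, 0.25)) for _ in range(n)]  # Gaussian I
gauss2 = [(random.gauss(0, 0.50), random.gauss(0, 0.50)) for _ in range(n)]  # Gaussian II

def mean_radius(points):
    """Mean distance of phosphene positions from the field center."""
    return sum(math.hypot(x, y) for x, y in points) / len(points)

for name, pts in [("Uniform", uniform), ("Gaussian I", gauss1), ("Gaussian II", gauss2)]:
    print(f"{name}: mean radial distance = {mean_radius(pts):.2f}")
```

The narrower Gaussian concentrates phosphenes centrally, which is the kind of layout difference the size-matching experiment compares across.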

  • Study of Tactile Perception based on phosphene positioning using simulated prosthetic vision.
    Artificial organs, 2008
    Co-Authors: Xinyu Chai, Leilei Zhang, Feng Shao, Kun Yang, Qiushi Ren
    Abstract:

    In recent years, as stimulation electrodes have been implanted in the visual cortex, optic nerve, and retina to generate visual Perceptions (phosphenes), research on prosthetic vision has become a popular topic. After implantation, it is crucial to evaluate the characteristics of the stimulated phosphenes. Several methods using Tactile Perception have been proposed to describe phosphene position, but no systematic study of the Perceptional behavior has been performed. Here, an experimental study of Tactile Perception based on phosphene positioning was conducted using simulated prosthetic vision. Results show that dispersion was smaller and response time shorter when phosphenes were generated in the near visual field than in the far visual field. Dispersion, accuracy, and response speed all improved when a visual guide was used. Moreover, the widely used method of taking the left hand as a reference and pointing at the phosphene with the right hand may introduce geographic error.
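The dispersion measure compared above can be sketched as the mean distance of pointing responses from their centroid. The data below are synthetic stand-ins for near- and far-field pointing clusters, not the study's measurements:

```python
# Dispersion of pointed positions around their centroid, compared for a
# hypothetical near-field vs. far-field target (synthetic data).
import math
import random

random.seed(2)

def dispersion(points):
    """Mean Euclidean distance of pointing responses from their centroid."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

# Assumed: tighter scatter for near-field targets, looser for far-field.
near = [(random.gauss(5.0, 0.5), random.gauss(0.0, 0.5)) for _ in range(40)]
far = [(random.gauss(25.0, 1.5), random.gauss(0.0, 1.5)) for _ in range(40)]

print(f"near dispersion: {dispersion(near):.2f}")
print(f"far dispersion:  {dispersion(far):.2f}")
```

A smaller dispersion for the near cluster reproduces, qualitatively, the near-versus-far result the abstract reports.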

Qiushi Ren - One of the best experts on this subject based on the ideXlab platform.

  • Estimation of simulated phosphene size based on Tactile Perception.
    Artificial organs, 2011
    Co-Authors: Panpan Chen, Ying Zhao, Jingru Shi, Qiushi Ren, Xinyu Chai
    Abstract:

    Clinical trials have successfully shown that a visual prosthesis can elicit visual Perception (phosphenes) in the visual field. Psychophysical studies based on simulated prosthetic vision offer an effective means to evaluate and refine prosthetic vision. We designed three experiments to examine the effect of phosphene luminance, flicker rate, and eccentricity on the ability to estimate simulated phosphene sizes using Tactile Perception. Thirty subjects participated in the three experiments. There was a linear increase in reported size as visual stimulus size increased. Judgment was significantly affected by stimulus luminance and eccentricity (P < 0.05) but not by flicker rates. Brighter stimuli were perceived as being larger, and the more eccentric the position, the larger the estimated size. These simulation studies, although idealized, suggested that Tactile Perception is a potential way to estimate phosphene sizes.

  • Study of Simulated Phosphene Size Based on Tactile Perception with Three Distributions
    2009 3rd International Conference on Bioinformatics and Biomedical Engineering, 2009
    Co-Authors: Shu Ling, Panpan Chen, Ying Zhao, Qiushi Ren, Cong Dai, Jin Fan, Xinyu Chai
    Abstract:

    This paper presents a method for estimating the size of simulated phosphenes based on Tactile Perception. To evaluate size matching, subjects are placed in a simulated prosthetic vision environment and use a size-matching Tactile board to match the sizes of the phosphenes they observe. Three spatial distributions of the simulated phosphenes (Uniform, Gaussian I, and Gaussian II) are studied, and the experimental results reveal differences among the three distributions.

  • Study of Tactile Perception based on phosphene positioning using simulated prosthetic vision.
    Artificial organs, 2008
    Co-Authors: Xinyu Chai, Leilei Zhang, Feng Shao, Kun Yang, Qiushi Ren
    Abstract:

    In recent years, as stimulation electrodes have been implanted in the visual cortex, optic nerve, and retina to generate visual Perceptions (phosphenes), research on prosthetic vision has become a popular topic. After implantation, it is crucial to evaluate the characteristics of the stimulated phosphenes. Several methods using Tactile Perception have been proposed to describe phosphene position, but no systematic study of the Perceptional behavior has been performed. Here, an experimental study of Tactile Perception based on phosphene positioning was conducted using simulated prosthetic vision. Results show that dispersion was smaller and response time shorter when phosphenes were generated in the near visual field than in the far visual field. Dispersion, accuracy, and response speed all improved when a visual guide was used. Moreover, the widely used method of taking the left hand as a reference and pointing at the phosphene with the right hand may introduce geographic error.

Shan Luo - One of the best experts on this subject based on the ideXlab platform.

  • “Touching to See” and “Seeing to Feel”: Robotic Cross-modal Sensory Data Generation for Visual-Tactile Perception
    International Conference on Robotics and Automation, 2019
    Co-Authors: Jet-tsyn Lee, Danushka Bollegala, Shan Luo
    Abstract:

    The integration of visual and Tactile stimuli is common when humans perform daily tasks, whereas unimodal visual or Tactile Perception limits the perceivable dimensionality of a subject. However, integrating visual and Tactile Perception to facilitate robotic tasks remains a challenge. In this paper, we propose a novel framework for cross-modal sensory data generation for visual and Tactile Perception. Taking texture Perception as an example, we apply conditional generative adversarial networks to generate pseudo visual images or Tactile outputs from data of the other modality. Extensive experiments on the ViTac dataset of cloth textures show that the proposed method can produce realistic outputs from the other sensory modality. We adopt the structural similarity index to evaluate the similarity between generated outputs and real data; the results show that realistic data have been generated. A classification evaluation further shows that including the generated data improves Perception performance. The proposed framework has the potential to expand datasets for classification tasks, to generate sensory outputs that are not easy to access, and to advance integrated visual-Tactile Perception.
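The structural similarity index used for evaluation above can be sketched as follows. This is a simplified global (single-window) SSIM, not the sliding-window variant typically used in practice, and the "real" and "generated" data are synthetic placeholders:

```python
# Simplified single-window SSIM between a "real" and a "generated" signal.
# The cGAN itself is not reproduced here; this sketches only the evaluation.
import random

random.seed(3)

def ssim_global(x, y, data_range=1.0):
    """Global SSIM with the standard stabilizing constants C1, C2."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

real = [random.random() for _ in range(1024)]        # stand-in "real" patch
identical = list(real)                               # perfect generation
noisy = [min(1.0, max(0.0, v + random.gauss(0, 0.2))) for v in real]

print(f"identical: {ssim_global(real, identical):.3f}")
print(f"noisy:     {ssim_global(real, noisy):.3f}")
```

An identical pair scores 1.0 and a degraded pair scores lower, which is how the abstract's similarity evaluation separates realistic from unrealistic generated outputs.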

  • "Touching to See" and "Seeing to Feel": Robotic Cross-modal Sensory Data Generation for Visual-Tactile Perception
    arXiv: Robotics, 2019
    Co-Authors: Jet-tsyn Lee, Danushka Bollegala, Shan Luo
    Abstract:

    The integration of visual and Tactile stimuli is common when humans perform daily tasks, whereas unimodal visual or Tactile Perception limits the perceivable dimensionality of a subject. However, integrating visual and Tactile Perception to facilitate robotic tasks remains a challenge. In this paper, we propose a novel framework for cross-modal sensory data generation for visual and Tactile Perception. Taking texture Perception as an example, we apply conditional generative adversarial networks to generate pseudo visual images or Tactile outputs from data of the other modality. Extensive experiments on the ViTac dataset of cloth textures show that the proposed method can produce realistic outputs from the other sensory modality. We adopt the structural similarity index to evaluate the similarity between generated outputs and real data; the results show that realistic data have been generated. A classification evaluation further shows that including the generated data improves Perception performance. The proposed framework has the potential to expand datasets for classification tasks, to generate sensory outputs that are not easy to access, and to advance integrated visual-Tactile Perception.

  • ICRA - “Touching to See” and “Seeing to Feel”: Robotic Cross-modal Sensory Data Generation for Visual-Tactile Perception
    2019 International Conference on Robotics and Automation (ICRA), 2019
    Co-Authors: Jet-tsyn Lee, Danushka Bollegala, Shan Luo
    Abstract:

    The integration of visual and Tactile stimuli is common when humans perform daily tasks, whereas unimodal visual or Tactile Perception limits the perceivable dimensionality of a subject. However, integrating visual and Tactile Perception to facilitate robotic tasks remains a challenge. In this paper, we propose a novel framework for cross-modal sensory data generation for visual and Tactile Perception. Taking texture Perception as an example, we apply conditional generative adversarial networks to generate pseudo visual images or Tactile outputs from data of the other modality. Extensive experiments on the ViTac dataset of cloth textures show that the proposed method can produce realistic outputs from the other sensory modality. We adopt the structural similarity index to evaluate the similarity between generated outputs and real data; the results show that realistic data have been generated. A classification evaluation further shows that including the generated data improves Perception performance. The proposed framework has the potential to expand datasets for classification tasks, to generate sensory outputs that are not easy to access, and to advance integrated visual-Tactile Perception.

  • Robotic Tactile Perception of object properties: A review
    Mechatronics, 2017
    Co-Authors: Shan Luo, Joao Bimbo, Ravinder S. Dahiya, Hongbin Liu
    Abstract:

    Touch sensing can help robots understand their surrounding environment, and in particular the objects they interact with. To this end, roboticists have, in the last few decades, developed several Tactile sensing solutions, extensively reported in the literature. Research into interpreting the conveyed Tactile information has also started to attract increasing attention in recent years. However, a comprehensive study on this topic is yet to be reported. In an effort to collect and summarize the major scientific achievements in the area, this survey extensively reviews current trends in robot Tactile Perception of object properties. Available Tactile sensing technologies are briefly presented before an extensive review on Tactile recognition of object properties. The object properties that are targeted by this review are shape, surface material and object pose. The role of touch sensing in combination with other sensing sources is also discussed. In this review, open issues are identified and future directions for applying Tactile sensing in different tasks are suggested.

Cornelius Schwarz - One of the best experts on this subject based on the ideXlab platform.

  • The Slip Hypothesis: Tactile Perception and its Neuronal Bases
    Trends in neurosciences, 2016
    Co-Authors: Cornelius Schwarz
    Abstract:

    The slip hypothesis of epicritic Tactile Perception interprets the actively moving sensor and the touched object as a frictional system, which is known to produce jerky relative movements called 'slips'. These slips depend on object geometry, forces, material properties, and environmental factors, and thus have the power to encode both the perceptual target and the perceptual strategy (sensor movement). Tactile information transferred by slips is encoded discontinuously in space and time, because slips sometimes engage only parts of the touching surfaces and appear as discrete, rare events in time. This discontinuity may have forced the Tactile systems of vibrissae and fingertips to evolve special ways of converting touch signals into a Tactile percept.

  • Support for the slip hypothesis from whisker-related Tactile Perception of rats in a noisy environment.
    Frontiers in integrative neuroscience, 2015
    Co-Authors: Christian Waiblinger, Dominik Brugger, Clarissa J. Whitmire, Garrett B. Stanley, Cornelius Schwarz
    Abstract:

    Rodents use active whisker movements to explore their environment. The 'slip hypothesis' of whisker-related Tactile Perception entails that short-lived kinematic events (abrupt whisker movements, called 'slips', caused by bioelastic whisker properties during active touch of textures) carry the decisive texture information. Supporting this hypothesis, previous studies have shown that slip amplitude and frequency vary in a texture-dependent way. Further, experiments employing passive pulsatile whisker deflections revealed that perceptual performance based on pulse kinematics (i.e., signatures that resemble slips) is far superior to that based on time-integrated variables such as frequency and intensity. So far, however, pulsatile stimuli have been employed only in a noise-free environment, whereas the realistic scenario involves background noise (e.g., evoked by rubbing across the texture). Therefore, if slips are used for Tactile Perception, the Tactile neuronal system must differentiate slip-evoked spikes from those evoked by noise. To test animals under these more realistic conditions, we presented passive whisker deflections to head-fixed trained rats, consisting of 'slip-like' events (waveforms mimicking the slips that occur during touch of real textures) embedded in background noise. Varying (i) the shapes (ramp or pulse), (ii) the kinematics (amplitude, velocity, etc.), and (iii) the probabilities of occurrence of slip-like events, we observed that rats could readily detect slip-like events of different shapes against a noisy background. Psychophysical curves revealed that the difference between slip-event and noise amplitude determined Perception, while an increased probability of occurrence (frequency) had barely any effect. These results strongly support the notion that encoding of kinematics dominantly determines whisker-related Tactile Perception, while the computation of frequency or intensity plays a minor role.
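The psychophysical curves described above can be illustrated with a toy logistic psychometric function of the slip-minus-noise amplitude difference. The threshold and slope values are invented for illustration; only the qualitative shape (detection driven by the amplitude difference, not by event frequency) reflects the finding:

```python
# Toy psychometric curve: detection probability as a logistic function of
# the difference between slip-event and noise amplitude. Parameters are
# hypothetical, not fitted to the study's data.
import math

def psychometric(delta, threshold=1.0, slope=3.0):
    """P(detect) as a logistic function of amplitude difference `delta`."""
    return 1.0 / (1.0 + math.exp(-slope * (delta - threshold)))

# Sweep hypothetical amplitude differences and print the curve.
for i in range(9):
    delta = i * 0.25
    print(f"delta={delta:.2f}  P(detect)={psychometric(delta):.2f}")
```

Detection probability passes through 0.5 at the threshold difference and saturates for large differences, the sigmoidal shape one reads off a psychophysical curve.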

Panpan Chen - One of the best experts on this subject based on the ideXlab platform.

  • Estimation of simulated phosphene size based on Tactile Perception.
    Artificial organs, 2011
    Co-Authors: Panpan Chen, Ying Zhao, Jingru Shi, Qiushi Ren, Xinyu Chai
    Abstract:

    Clinical trials have successfully shown that a visual prosthesis can elicit visual Perception (phosphenes) in the visual field. Psychophysical studies based on simulated prosthetic vision offer an effective means to evaluate and refine prosthetic vision. We designed three experiments to examine the effect of phosphene luminance, flicker rate, and eccentricity on the ability to estimate simulated phosphene sizes using Tactile Perception. Thirty subjects participated in the three experiments. There was a linear increase in reported size as visual stimulus size increased. Judgment was significantly affected by stimulus luminance and eccentricity (P < 0.05) but not by flicker rates. Brighter stimuli were perceived as being larger, and the more eccentric the position, the larger the estimated size. These simulation studies, although idealized, suggested that Tactile Perception is a potential way to estimate phosphene sizes.

  • Study of Simulated Phosphene Size Based on Tactile Perception with Three Distributions
    2009 3rd International Conference on Bioinformatics and Biomedical Engineering, 2009
    Co-Authors: Shu Ling, Panpan Chen, Ying Zhao, Qiushi Ren, Cong Dai, Jin Fan, Xinyu Chai
    Abstract:

    This paper presents a method for estimating the size of simulated phosphenes based on Tactile Perception. To evaluate size matching, subjects are placed in a simulated prosthetic vision environment and use a size-matching Tactile board to match the sizes of the phosphenes they observe. Three spatial distributions of the simulated phosphenes (Uniform, Gaussian I, and Gaussian II) are studied, and the experimental results reveal differences among the three distributions.