Visual Environment

The Experts below are selected from a list of 324 Experts worldwide ranked by ideXlab platform

C C A M Gielen - One of the best experts on this subject based on the ideXlab platform.

  • Frequency dependence of the action-perception cycle for postural control in a moving Visual Environment: relative phase dynamics
    Biological Cybernetics, 1994
    Co-Authors: Tjeerd M H Dijkstra, Gregor Schoner, Martin A Giese, C C A M Gielen
    Abstract:

    When standing human subjects are exposed to a moving Visual Environment, the induced postural sway displays varying degrees of coherence with the Visual information. In our experiment we varied the frequency of an oscillatory Visual display and analysed the temporal relationship between Visual motion and sway. We found that subjects maintain sizeable sway amplitudes even as temporal coherence with the display is lost. Postural sway tended to phase lead (for frequencies below 0.2 Hz) or phase lag (above 0.3 Hz). However, at a fixed frequency we also observed highly variable phase relationships in which a preferred range of phase lags is prevalent, but phase jumps return the system to the preferred range after the phase has begun drifting out of it. By comparing the results quantitatively with a dynamical model (the sine-circle map), we show that this effect can be understood as a form of relative coordination and arises through an instability of the dynamics of the action-perception cycle. Because such instabilities cannot arise in passively driven systems, we conclude that postural sway in this situation is actively generated as rhythmic movement that is coupled dynamically to the Visual motion.
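The sine-circle map invoked above is easy to iterate numerically. A minimal sketch follows, with illustrative parameter values (`omega`, `k`) that are not the ones fitted in the study; with weak coupling and some detuning, the phase drifts slowly through a preferred range and then slips rapidly, the signature of relative coordination.

```python
import math

def sine_circle_map(phi, omega, k):
    """One iteration of the standard sine-circle map, phase measured in
    cycles and wrapped to [0, 1)."""
    return (phi + omega - (k / (2 * math.pi)) * math.sin(2 * math.pi * phi)) % 1.0

# Iterate and collect the phase trajectory; omega plays the role of the
# frequency detuning between display and sway, k the coupling strength.
phi = 0.0
trajectory = []
for _ in range(200):
    phi = sine_circle_map(phi, omega=0.12, k=0.9)
    trajectory.append(phi)
```

Inspecting `trajectory` for runs of nearly constant phase punctuated by rapid jumps reproduces, qualitatively, the phase-slip behaviour the abstract describes.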

  • Temporal stability of the action-perception cycle for postural control in a moving Visual Environment
    Experimental Brain Research, 1994
    Co-Authors: Tjeerd M H Dijkstra, Gregor Schoner, C C A M Gielen
    Abstract:

    When standing human subjects are exposed to a moving Visual Environment, the induced postural sway forms a stable temporal relationship with the Visual information. We have investigated this relationship experimentally with a new set-up in which a computer generates video images corresponding to the motion of a 3D Environment. The suggested mean distance to a sinusoidally moving wall is varied, and the temporal relationship to the induced sway is analysed (1) in terms of the fluctuations of relative phase between Visual and sway motion and (2) in terms of the relaxation time of relative phase, as determined from the rate of recovery of the stable relative phase pattern following abrupt changes in the Visual motion pattern. The two measures are found to converge to a well-defined temporal stability of the action-perception cycle. Furthermore, we show that this temporal stability is a sensitive measure of the strength of the action-perception coupling: it decreases as the distance of the Visual scene from the observer increases. This fact and the increase of mean relative phase are consistent with predictions of a linear second-order system driven by the Visual expansion rate. However, the amplitude of visually induced sway decreases little as Visual distance increases, in contradiction to these predictions, which is suggestive of a process that actively generates sway. The Visual expansion rate on the optic array is found to decrease strongly with Visual distance. This leads to the conclusion that postural control in a moving Visual Environment cannot be understood simply in terms of minimization of retinal slip, and that dynamic coupling of vision into the postural control system must be taken into account.
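The dependence of expansion rate on distance noted above can be made concrete: for a frontoparallel wall at distance D approached with velocity v, the relative expansion rate of its image is approximately v/D, so the same wall motion produces a weaker Visual drive the farther away it is. A minimal sketch, with illustrative numbers:

```python
def expansion_rate(wall_velocity, wall_distance):
    """Approximate relative expansion rate (1/s) of the image of a
    frontoparallel wall: image size scales as 1/distance, so its relative
    rate of change is velocity / distance."""
    return wall_velocity / wall_distance

# Doubling the wall distance halves the expansion rate for the same motion.
near = expansion_rate(0.05, 1.0)  # 5 cm/s of wall motion at 1 m
far = expansion_rate(0.05, 2.0)   # the same motion at 2 m
```

This 1/D fall-off is what makes the retinal-slip account insufficient on its own: the measured sway amplitude shrinks far less with distance than the Visual drive does.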

B. N. Smetanin - One of the best experts on this subject based on the ideXlab platform.

  • Influence of a light tactile contact on vertical posture maintenance under the conditions of destabilization of Visual Environment
    Human Physiology, 2015
    Co-Authors: G. V. Kozhina, Yu. S. Levik, B. N. Smetanin
    Abstract:

    The influence of a light contact between the index finger and a stationary external surface on the maintenance of upright posture has been studied in healthy subjects "immersed" in an unstable virtual Visual Environment. Under these conditions, the subjects saw a screen with a Visual scene consisting of a foreground and a background. In the foreground there was a window of a room with the adjacent walls; in the background, an aqueduct with the adjacent terrain. The virtual Visual Environment was destabilized by setting inphase or antiphase coupling between the foreground and body oscillations. The analysis of upright posture maintenance was focused on the assessment of the amplitude-frequency characteristics of two elementary variables calculated from the trajectories of the center of pressure of the feet (CoP) in the mediolateral and anteroposterior directions: the trajectory of the vertical projection of the center of gravity (the CG variable) and the difference between the CoP and CG trajectories (the CoP-CG variable). Both for normal posture and for posture with fingertip contact, the root mean square (RMS) values of the spectra of both variables were lowest in a motionless Visual Environment and with antiphase coupling between the foreground and body oscillations, and highest with inphase coupling and with eyes closed. With fingertip contact, the intensity of body oscillations in both directions was considerably lower, and the influence of the different Visual conditions on the RMS values of the spectra of both variables decreased. This effect was more significant for the CG variable. The frequency of body oscillations decreased as well. Tactile contact also affected the frequencies of the spectra of both variables: the median frequencies of the spectra of the CoP-CG variable, calculated from body oscillations in the anteroposterior and mediolateral directions, increased under the conditions of tactile contact, whereas the median frequencies of the spectra of the CG variable increased only for body oscillations in the mediolateral direction. Our results show that a light tactile contact (providing no mechanical support) significantly improves vertical posture maintenance, inter alia under the conditions of destabilization of the virtual Visual Environment. This improvement is provided by multidirectional and independent effects on the amplitude-frequency characteristics of the elementary variables (CG and CoP-CG).
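The median frequency used above to characterize the CG and CoP-CG variables is the frequency that splits the total spectral power into equal halves. A minimal sketch on a synthetic sway signal (the signal, sampling rate, and record length are illustrative assumptions, not the study's data):

```python
import numpy as np

def median_frequency(signal, fs):
    """Frequency below which half of the total spectral power lies."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    cumulative = np.cumsum(power)
    idx = np.searchsorted(cumulative, 0.5 * cumulative[-1])
    return freqs[idx]

fs = 50.0                            # illustrative sampling rate, Hz
t = np.arange(0, 60, 1.0 / fs)       # a 60-s record
sway = np.sin(2 * np.pi * 0.3 * t)   # pure 0.3 Hz oscillation
mf = median_frequency(sway, fs)      # recovers a value close to 0.3 Hz
```

On a real stabilogram the spectrum is broadband, and the same computation yields the median frequencies whose shifts under tactile contact the abstract reports.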

  • The Influence of Light Tactile Contact on the Maintenance of Vertical Posture under the Conditions of Destabilization of the Visual Environment
    Fiziologiia cheloveka, 2015
    Co-Authors: G. V. Kozhina, Yu. S. Levik, B. N. Smetanin
    Abstract:

    We studied the influence of a light contact of the index finger with a stationary surface of the external Environment on the maintenance of upright posture in healthy subjects "immersed" in an unstable virtual Visual Environment. Under these conditions, the subjects saw a screen with a Visual scene consisting of a foreground and a background. The foreground represented a window of a room with adjoining walls; the background, an aqueduct with the adjacent terrain. The virtual Visual Environment was destabilized by setting inphase or antiphase links between the foreground and body oscillations. The analysis of the maintenance of upright posture was focused on the assessment of the amplitude-frequency characteristics of two elementary variables calculated from the trajectories of the center of pressure of the feet (CoP) in the mediolateral and anteroposterior directions: the trajectory of the vertical projection of the center of gravity (the CG variable) and the difference between the CoP and CG trajectories (the CoP-CG variable). Both in normal posture and in posture with fingertip contact, the root mean square (RMS) values of the spectra of both variables were lowest in a motionless Visual Environment with an antiphase link between the foreground and body oscillations, and highest with an inphase link and with eyes closed. In the cases with fingertip contact, the intensity of body oscillations in both directions was considerably lower, and the influence of the different Visual conditions on the RMS values of the spectra of both variables decreased. This effect was more significant for the CG variable. The frequency of body oscillations decreased as well. We also observed an effect of tactile contact on the frequencies of the spectra of both variables: the median frequencies of the spectra of the CoP-CG variable, calculated from body oscillations in the anteroposterior and mediolateral directions, increased under the conditions of a tactile contact, whereas the median frequencies of the spectra of the CG variable increased only for body oscillations in the mediolateral direction. Our results showed that a light tactile contact (providing no mechanical support) significantly improves the maintenance of vertical posture, including under conditions of destabilization of the virtual Visual Environment. This improvement is provided by multidirectional and independent effects on the amplitude-frequency characteristics of the elementary variables (CG and CoP-CG).

  • Human upright posture control in a virtual Visual Environment
    Human Physiology, 2009
    Co-Authors: B. N. Smetanin, G. V. Kozhina, A. K. Popov
    Abstract:

    The sagittal and frontal components of the stabilogram were monitored in 14 healthy subjects standing on a rigid or pliant support under three different conditions of Visual control: with the eyes open (EO), with the eyes closed (EC), or in a virtual Visual Environment (VVE). Under the VVE conditions, the subjects looked at a three-dimensional image of elements of a room (a 3-D artificial room) that was generated by a computer and locked to the fluctuations of the body center of gravity (CG) so that the Visual connection between body sway and shifts of the Visual Environment typical of normal Visual conditions was reproduced. Frequency filtration of the fluctuations of the foot's center of pressure (FCP) was used to isolate the movements of the vertical projection of the CG and determine the difference between these two variables. The changes in the variables (CG and FCP-CG) were estimated using spectral analysis followed by the calculation of the root mean square (RMS) amplitudes of their spectral fluctuations. In subjects standing on a rigid support, the RMS amplitudes of the spectra of both variables were the highest under the VVE and EC conditions and the lowest under the EO conditions. In subjects standing on a pliant support, body sway was considerably enhanced, which was accompanied by a different pattern of Visual influences: the RMS values were the highest under the EC conditions and were lower by a factor of 2-2.5 under the EO and VVE conditions. Thus, it has been demonstrated that the cerebral structures controlling posture ignore the afferent input from the eyes under VVE conditions if the subject is standing on a rigid support and the CG fluctuations are relatively small; however, this afferent input is efficiently used for maintaining posture on a pliant support, when body sway is substantially enhanced.
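The frequency filtration described above, which separates the slow center-of-gravity (CG) component of the FCP trace from the fast FCP-CG remainder, can be sketched with a simple FFT low-pass filter. The 0.5 Hz cutoff and the synthetic signal here are illustrative assumptions, not the study's actual filter:

```python
import numpy as np

def split_fcp(fcp, fs, cutoff=0.5):
    """Split a center-of-pressure trace into a low-frequency CG estimate
    and the residual FCP-CG component with an FFT brick-wall filter."""
    spectrum = np.fft.rfft(fcp)
    freqs = np.fft.rfftfreq(len(fcp), d=1.0 / fs)
    low = spectrum.copy()
    low[freqs > cutoff] = 0.0              # keep only slow components -> CG
    cg = np.fft.irfft(low, n=len(fcp))
    return cg, fcp - cg                    # (CG, FCP-CG)

fs = 50.0
t = np.arange(0, 40, 1.0 / fs)
# Synthetic stabilogram: slow 0.2 Hz body sway plus a fast 2 Hz component.
fcp = 1.0 * np.sin(2 * np.pi * 0.2 * t) + 0.2 * np.sin(2 * np.pi * 2.0 * t)
cg, fcp_minus_cg = split_fcp(fcp, fs)
```

The RMS amplitudes the abstract compares across the EO, EC, and VVE conditions would then be computed separately on `cg` and `fcp_minus_cg`.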

Jocelyn Faubert - One of the best experts on this subject based on the ideXlab platform.

  • Postural hypo-reactivity in autism is contingent on development and Visual Environment: A fully immersive virtual reality study
    Journal of Autism and Developmental Disorders, 2012
    Co-Authors: Selma Greffou, Eva Maria Hahler, Jean Marie Hanssens, Laurent Mottron, Armando Bertone, Jocelyn Faubert
    Abstract:

    Although atypical motor behaviors have been associated with autism, investigations regarding their possible origins are scarce. This study assessed the Visual and vestibular components involved in atypical postural reactivity in autism. Postural reactivity and stability were measured for younger (12-15 years) and older (16-33 years) autistic participants in response to a virtual tunnel oscillating at different frequencies. At the highest oscillation frequency, younger autistic participants showed significantly less instability compared to younger typically-developing participants; no such group differences were evidenced for older participants. Additionally, no significant differences in postural behavior were found among the four groups when static Visual information, or no Visual information, was presented. Results confirm that postural hypo-reactivity to Visual information is present in autism but is contingent on both the Visual Environment and development.

Bernhard Laback - One of the best experts on this subject based on the ideXlab platform.

  • 3-D localization of virtual sound sources: effects of Visual Environment, pointing method, and training
    Attention Perception & Psychophysics, 2010
    Co-Authors: Piotr Majdak, Matthew J. Goupell, Bernhard Laback
    Abstract:

    The ability to localize sound sources in three-dimensional space was tested in humans. In Experiment 1, naive subjects listened to noises filtered with subject-specific head-related transfer functions. The tested conditions included the pointing method (head or manual pointing) and the Visual Environment (VE; darkness or virtual VE). The localization performance was not significantly different between the pointing methods. The virtual VE significantly improved the horizontal precision and reduced the number of front-back confusions. These results show the benefit of using a virtual VE in sound localization tasks. In Experiment 2, subjects were provided with sound localization training. Over the course of training, the performance improved for all subjects, with the largest improvements occurring during the first 400 trials. The improvements beyond the first 400 trials were smaller. After the training, there was still no significant effect of pointing method, showing that the choice of either head- or manual-pointing method plays a minor role in sound localization performance. The results of Experiment 2 reinforce the importance of perceptual training for at least 400 trials in sound localization studies.

  • The Accuracy of Localizing Virtual Sound Sources: Effects of Pointing Method and Visual Environment
    Journal of The Audio Engineering Society, 2008
    Co-Authors: Matthew J. Goupell, Bernhard Laback, Piotr Majdak, Michael Mihocic
    Abstract:

    The ability to localize sound sources in 3D space was tested in humans. The subjects listened to noises filtered with subject-specific head-related transfer functions. In the experiment using naive subjects, the conditions included the type of Visual Environment (darkness or structured virtual world), presented via a head-mounted display, and the pointing method (head or manual pointing). The results show that the errors in the horizontal dimension were smaller when head pointing was used, whereas manual pointing showed smaller errors in the vertical dimension. Generally, the effect of pointing method was significant but small. The presence of a structured virtual Visual Environment significantly improved the localization accuracy in all conditions. This supports the benefit of using a virtual Visual Environment in acoustic tasks like sound localization.