Heading Direction

14,000,000 Leading Edge Experts on the ideXlab platform

The Experts below are selected from a list of 8994 Experts worldwide ranked by the ideXlab platform

Dongkyoung Chwa - One of the best experts on this subject based on the ideXlab platform.

  • Coupled Multiple Sliding-Mode Control for Robust Trajectory Tracking of Hovercraft With External Disturbances
    IEEE Transactions on Industrial Electronics, 2018
    Co-Authors: Seongchan Jeong, Dongkyoung Chwa
    Abstract:

    This paper proposes a robust coupled multiple sliding-mode control (CMSMC) method for tracking control of underactuated hovercraft systems with nonholonomic constraints and external disturbances. First, a friction model for the hovercraft, one of the main disturbance factors, is proposed by considering viscosity, and its validity is demonstrated through experiments. Second, a disturbance model in the actual hovercraft system is estimated using a least-square-estimation-based disturbance observer. Third, pseudo forces and a pseudo heading direction angle for tracking control and disturbance compensation are proposed considering the characteristics of underactuated hovercraft systems. Fourth, coupled multiple sliding surfaces (CMSSs) are newly introduced in terms of the tracking errors between the pseudo control variables and the actual ones, and then a CMSMC-based controller is proposed so that the CMSSs converge to zero within finite time in the case of zero disturbance estimation errors. In this way, three posture variables of the hovercraft converge to the reference ones using only two control inputs. Finally, stability analysis and verification by simulations and experiments show that both the pseudo control tracking errors and the posture tracking errors are ultimately bounded and asymptotically converge to zero when the disturbance estimation errors become zero.
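
    The finite-time convergence argument behind sliding-mode designs like the one above can be illustrated on a much simpler plant. The sketch below is hypothetical (not the paper's CMSMC or its hovercraft model): it drives a disturbed double integrator to the origin with a classical sliding surface, where the switching gain k need only dominate the disturbance bound.

    ```python
    import numpy as np

    # Hedged illustration, not the paper's controller: classical sliding-mode
    # control of a disturbed double integrator x'' = u + d, tracking x_ref = 0.
    # The surface s = e' + lam*e reaches zero in finite time because the
    # switching term -k*sign(s) dominates any disturbance with |d| <= k.

    def simulate(lam=2.0, k=1.5, dt=1e-3, T=5.0):
        x, v = 1.0, 0.0                        # initial position and velocity
        for step in range(int(T / dt)):
            t = step * dt
            d = 0.5 * np.sin(3.0 * t)          # bounded disturbance, |d| <= 0.5 < k
            s = v + lam * x                    # sliding surface (e = x, e' = v)
            u = -lam * v - k * np.sign(s)      # equivalent control + switching term
            v += (u + d) * dt                  # Euler integration of the plant
            x += v * dt
        return x, v

    x_f, v_f = simulate()                      # state ends near the origin
    ```

    The switching term produces the characteristic chattering, which boundary-layer or higher-order sliding-mode variants mitigate in practice.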

  • Tracking Control of Differential-Drive Wheeled Mobile Robots Using a Backstepping-Like Feedback Linearization
    IEEE Transactions on Systems, Man, and Cybernetics, 2010
    Co-Authors: Dongkyoung Chwa
    Abstract:

    This paper proposes a tracking control method for differential-drive wheeled mobile robots with nonholonomic constraints by using a backstepping-like feedback linearization. Unlike previous backstepping controllers for wheeled mobile robots, a backstepping-like feedback control structure is proposed in the form of a cascaded kinematic and dynamic linearization to obtain a simpler and more modular control structure. First, the pseudo commands for the forward linear velocity and the heading direction angle are designed based on the kinematics. Then, the actual torque control inputs are designed to make the actual forward linear velocity and heading direction angle follow their corresponding pseudo commands. A stability analysis shows that the tracking errors of the posture (the position and heading direction angle) are globally ultimately bounded and that their ultimate bound can be adjusted by a proper choice of control parameters. In addition, numerical simulations for various reference trajectories (e.g., a straight line, a circle, a sinusoidal curve, a spinning trajectory with no forward velocity and a nonzero rotational velocity, a cross-shaped trajectory changing between the forward and backward directions, etc.) show the validity of the proposed scheme.
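
    As a rough sketch of the kinematic level of such a cascade (illustrative only: the gains, names, and point-reaching task below are assumptions, and the paper's dynamic torque level is omitted), pseudo commands for forward speed and heading can steer a standard unicycle model:

    ```python
    import numpy as np

    # Differential-drive (unicycle) kinematics driven by kinematic pseudo
    # commands. Names and gains are illustrative; the dynamic (torque) level
    # of the cascade is omitted here.

    def unicycle_step(pose, v, omega, dt):
        """One Euler step of the standard unicycle kinematic model."""
        x, y, theta = pose
        return (x + v * np.cos(theta) * dt,
                y + v * np.sin(theta) * dt,
                theta + omega * dt)

    def pseudo_commands(pose, target, k_v=1.0, k_th=4.0):
        """Kinematic level: forward-speed and heading-rate commands."""
        x, y, theta = pose
        ex, ey = target[0] - x, target[1] - y
        v_cmd = k_v * np.hypot(ex, ey)               # slow down near the target
        theta_cmd = np.arctan2(ey, ex)               # desired heading direction
        err = np.arctan2(np.sin(theta_cmd - theta),  # heading error, wrapped
                         np.cos(theta_cmd - theta))
        return v_cmd, k_th * err

    pose, target = (0.0, 0.0, 0.0), (1.0, 1.0)
    for _ in range(5000):                            # 5 s at dt = 1 ms
        v, omega = pseudo_commands(pose, target)
        pose = unicycle_step(pose, v, omega, dt=1e-3)
    ```

    In the full backstepping-like structure, these kinematic commands become references that a torque-level controller tracks.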

  • Sliding-Mode Tracking Control of Nonholonomic Wheeled Mobile Robots in Polar Coordinates
    IEEE Transactions on Control Systems Technology, 2004
    Co-Authors: Dongkyoung Chwa
    Abstract:

    This brief proposes a sliding-mode control method for wheeled mobile robots with kinematics expressed in two-dimensional polar coordinates. In the proposed method, two controllers are designed to asymptotically stabilize the tracking errors in position and heading direction, respectively. By combining these controllers, both asymptotic posture (position and heading direction) stabilization and trajectory tracking are achieved globally, except in an arbitrarily small region around the origin. In particular, constraints on the desired linear and angular velocities, as well as on the posture of the mobile robot, are eliminated, unlike in previous studies based on kinematics expressed in polar coordinates. Accordingly, arbitrary trajectories, including circles and straight lines in various forms, can be followed even with large initial tracking errors and bounded disturbances. Stability and performance analyses are performed, and simulations are included to confirm the effectiveness of the proposed scheme.

Charles J Duffy - One of the best experts on this subject based on the ideXlab platform.

Wenhao Zhang - One of the best experts on this subject based on the ideXlab platform.

  • Complementary Congruent and Opposite Neurons Achieve Concurrent Multisensory Integration and Segregation
    eLife, 2019
    Co-Authors: Tai Sing Lee, Wenhao Zhang, He Wang, Aihua Chen, K Y Michael Wong
    Abstract:

    Our brain perceives the world by exploiting multisensory cues to extract information about various aspects of external stimuli. The sensory cues from the same stimulus should be integrated to improve perception, and otherwise segregated to distinguish different stimuli. In reality, however, the brain faces the challenge of recognizing stimuli without knowing in advance the sources of sensory cues. To address this challenge, we propose that the brain conducts integration and segregation concurrently with complementary neurons. Studying the inference of heading direction via visual and vestibular cues, we develop a network model with two reciprocally connected modules modeling interacting visual-vestibular areas. In each module, there are two groups of neurons whose tunings under each sensory cue are either congruent or opposite. We show that congruent neurons implement integration, while opposite neurons compute cue disparity information for segregation, and the interplay between two groups of neurons achieves efficient multisensory information processing.
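
    In a simplified Gaussian reading of this division of labor (a toy sketch with made-up numbers, not the paper's network model), integration amounts to reliability-weighted averaging of the two cues, while the disparity signal stays available for segregation:

    ```python
    # Toy Gaussian sketch of the two complementary computations; function
    # names and numbers are illustrative, not the paper's network model.

    def integrate(mu_vis, var_vis, mu_vest, var_vest):
        """Reliability-weighted fusion of two Gaussian cue estimates."""
        w = var_vest / (var_vis + var_vest)      # weight on the visual cue
        mu = w * mu_vis + (1.0 - w) * mu_vest
        var = var_vis * var_vest / (var_vis + var_vest)
        return mu, var                           # fused mean, reduced variance

    def disparity(mu_vis, mu_vest):
        """Cue-disparity signal, analogous to what opposite neurons encode."""
        return mu_vis - mu_vest

    # visual cue at 30 deg (reliable), vestibular cue at 40 deg (less so):
    mu, var = integrate(mu_vis=30.0, var_vis=4.0, mu_vest=40.0, var_vest=16.0)
    # fused estimate 32 deg lies nearer the more reliable cue, and the fused
    # variance 3.2 is smaller than either single-cue variance
    ```

    A large disparity signal would indicate that the cues likely come from different stimuli, in which case the fused estimate should be discarded in favor of the single-cue estimates.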

  • Concurrent Multisensory Integration and Segregation With Complementary Congruent and Opposite Neurons
    bioRxiv, 2018
    Co-Authors: Tai Sing Lee, Wenhao Zhang, He Wang, Aihua Chen, K Y Michael Wong
    Abstract:

    Our brain perceives the world by exploiting multiple sensory modalities to extract information about various aspects of external stimuli. If these sensory cues are from the same stimulus of interest, they should be integrated to improve perception; otherwise, they should be segregated to distinguish different stimuli. In reality, however, the brain faces the challenge of recognizing stimuli without knowing in advance whether sensory cues come from the same or different stimuli. To address this challenge and to recognize stimuli rapidly, we argue that the brain should carry out multisensory integration and segregation concurrently with complementary neuron groups. Studying an example of inferring heading direction via visual and vestibular cues, we develop a concurrent multisensory processing neural model that consists of two reciprocally connected modules, the dorsal medial superior temporal area (MSTd) and the ventral intraparietal area (VIP); in each module, there exist two distinct groups of neurons, congruent and opposite neurons. Specifically, congruent neurons implement cue integration, while opposite neurons compute the cue disparity, both optimally as described by Bayesian inference. The two groups of neurons provide complementary information that enables the neural system to assess the validity of cue integration and, if necessary, to recover the lost information associated with individual cues without re-gathering new inputs. Through this process, the brain achieves rapid stimulus perception if the cues come from the same stimulus of interest, and differentiates and recognizes stimuli based on individual cues with little time delay if the cues come from different stimuli of interest. Our study unveils the indispensable role of opposite neurons in multisensory processing and sheds light on our understanding of how the brain achieves multisensory processing efficiently and rapidly.
    Significance Statement: Our brain perceives the world by exploiting multiple sensory cues. These cues need to be integrated to improve perception if they come from the same stimulus, and otherwise to be segregated. To address the challenge of recognizing whether sensory cues come from the same or different stimuli, which is unknown in advance, we propose that the brain should carry out multisensory integration and segregation concurrently with two different neuron groups. Specifically, congruent neurons implement cue integration, while opposite neurons compute the cue disparity, and the interplay between them achieves rapid stimulus recognition without information loss. We apply our model to the example of inferring heading direction based on visual and vestibular cues and reproduce the experimental data successfully.

  • Decentralized Multisensory Information Integration in Neural Systems
    The Journal of Neuroscience, 2016
    Co-Authors: Wenhao Zhang, Aihua Chen, Malte J Rasch
    Abstract:

    How multiple sensory cues are integrated in neural circuitry remains a challenge. The common hypothesis is that information integration might be accomplished in a dedicated multisensory integration area receiving feedforward inputs from the modalities. However, recent experimental evidence suggests that it is not a single multisensory brain area, but rather many multisensory brain areas, that are simultaneously involved in the integration of information. Why many mutually connected areas should be needed for information integration is puzzling. Here, we investigated theoretically how information integration could be achieved in a distributed fashion within a network of interconnected multisensory areas. Using biologically realistic neural network models, we developed a decentralized information integration system that comprises multiple interconnected integration areas. Studying an example of combining visual and vestibular cues to infer heading direction, we show that such a decentralized system is in good agreement with anatomical evidence and experimental observations. In particular, we show that this decentralized system can integrate information optimally. The decentralized system predicts that optimally integrated information should emerge locally from the dynamics of the communication between brain areas, and it sheds new light on the interpretation of the connectivity between multisensory brain areas.
    SIGNIFICANCE STATEMENT: To extract information reliably from ambiguous environments, the brain integrates multiple sensory cues, which provide different aspects of information about the same entity of interest. Here, we propose a decentralized architecture for multisensory integration. In such a system, no processor is at the center of the network topology, and information integration is achieved in a distributed manner through reciprocally connected local processors. Through studying the inference of heading direction with visual and vestibular cues, we show that the decentralized system can integrate information optimally, with the reciprocal connections between processors determining the extent of cue integration. Our model reproduces known multisensory integration behaviors observed in experiments and sheds new light on our understanding of how information is integrated in the brain.
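
    The decentralized claim can be caricatured with two coupled linear estimators (an illustrative sketch with made-up gains, not the paper's biologically realistic network): each module sees only its own cue, yet through the reciprocal connections both settle near the precision-weighted fused value, with no central fusion node.

    ```python
    # Illustrative linear caricature of decentralized integration; the gains
    # and dynamics are assumptions, not the paper's network model.

    def decentralized_fusion(cue_a, prec_a, cue_b, prec_b,
                             coupling=10.0, steps=5000, dt=0.01):
        est_a, est_b = cue_a, cue_b      # each module starts at its own cue
        for _ in range(steps):
            # each module is pulled toward its own cue (scaled by precision)
            # and toward the other module via the reciprocal connection
            da = prec_a * (cue_a - est_a) + coupling * (est_b - est_a)
            db = prec_b * (cue_b - est_b) + coupling * (est_a - est_b)
            est_a += dt * da
            est_b += dt * db
        return est_a, est_b

    # cue A: 30 deg, precision 0.25; cue B: 40 deg, precision 0.0625; the
    # precision-weighted optimum is (0.25*30 + 0.0625*40) / 0.3125 = 32 deg
    a, b = decentralized_fusion(cue_a=30.0, prec_a=0.25,
                                cue_b=40.0, prec_b=0.0625)
    ```

    The coupling strength plays the role the abstract attributes to the reciprocal connections: weaker coupling leaves the two estimates further apart, i.e., less complete integration.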

Eric R Kandel - One of the best experts on this subject based on the ideXlab platform.

  • Heading Direction with respect to a reference point modulates place cell activity
    Nature Communications, 2019
    Co-Authors: Pablo E Jercog, Yashar Ahmadian, Caitlin Woodruff, Rajeev Debsen, L F Abbott, Eric R Kandel
    Abstract:

    The tuning of neurons in area CA1 of the hippocampus emerges through a combination of non-spatial input from different sensory modalities and spatial information about the animal’s position and heading direction relative to the spatial enclosure being navigated. The positional modulation of CA1 neuronal responses has been widely studied (e.g. place tuning), but less is known about the modulation of these neurons by heading direction. Here, utilizing electrophysiological recordings from CA1 pyramidal cells in freely moving mice, we report that a majority of neural responses are modulated by the heading direction of the animal relative to a point within or outside their enclosure that we call a reference point. The finding of heading direction modulation relative to reference points identifies a novel representation encoded in the neuronal responses of the dorsal hippocampus. Place cells are neurons in the hippocampus that encode an animal’s location in space. Here, in mice, the authors show that place cell activity is also modulated by the heading direction of the animal relative to a particular “reference point” that can be either within or outside their enclosure.
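
    The tuning variable itself is simple geometry (a plain illustration; this is not the authors' analysis code): the reference-point heading is the signed angle between the animal's heading and the bearing from the animal to the reference point, so it varies with position even when the allocentric heading is fixed.

    ```python
    import numpy as np

    # Signed angle between the animal's heading and the bearing to a
    # reference point; an illustration of the tuning variable, not the
    # authors' analysis code.

    def heading_wrt_reference(position, heading, reference):
        """Angle in (-pi, pi]; 0 means moving straight toward the reference."""
        bearing = np.arctan2(reference[1] - position[1],
                             reference[0] - position[0])
        diff = heading - bearing
        return np.arctan2(np.sin(diff), np.cos(diff))   # wrap to (-pi, pi]

    # moving due east from the origin with the reference point due north:
    angle = heading_wrt_reference(position=(0.0, 0.0), heading=0.0,
                                  reference=(0.0, 1.0))
    # angle = -pi/2: the reference lies 90 degrees to the animal's left
    ```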

  • Heading Direction with respect to a reference point modulates place cell activity
    bioRxiv, 2018
    Co-Authors: Pablo E Jercog, Yashar Ahmadian, Caitlin Woodruff, Rajeev Debsen, L F Abbott, Eric R Kandel
    Abstract:

    Utilizing electrophysiological recordings from CA1 pyramidal cells in freely moving mice, we find that a majority of neural responses are modulated by the heading direction of the animal relative to a point within or outside their enclosure that we call a reference point. Our findings identify a novel representation in the neuronal responses of the dorsal hippocampus.

William K Page - One of the best experts on this subject based on the ideXlab platform.