Body Language

The Experts below are selected from a list of 11,664 Experts worldwide, ranked by the ideXlab platform.

Beatrice De Gelder - One of the best experts on this subject based on the ideXlab platform.

  • Instrumental Music Influences Recognition of Emotional Body Language
    Brain Topography: a Journal of Cerebral Function and Dynamics, 2009
    Co-Authors: Jan Van Den Stock, Isabelle Peretz, Julie Grèzes, Beatrice De Gelder
    Abstract:

    In everyday life, emotional events are perceived by multiple sensory systems. Research has shown that recognition of emotions in one modality is biased towards the emotion expressed in a simultaneously presented but task-irrelevant modality. In the present study, we combine visual and auditory stimuli that convey similar affective meaning but have a low probability of co-occurrence in everyday life. Dynamic face-blurred whole Body expressions of a person grasping an object while expressing happiness or sadness were presented in combination with fragments of happy or sad instrumental classical music. Participants were instructed to categorize the emotion expressed by the visual stimulus. The results show that recognition of Body Language is influenced by the auditory stimuli. These findings indicate that crossmodal influences as previously observed for audiovisual speech can also be obtained from the ignored auditory to the attended visual modality in audiovisual stimuli that consist of whole bodies and music.

  • Huntington's disease impairs recognition of angry and instrumental Body Language
    Neuropsychologia, 2008
    Co-Authors: Beatrice De Gelder, Jan Van Den Stock, Ruth de Diego-Balaguer, Anne-Catherine Bachoud-Lévi
    Abstract:

    Patients with Huntington's disease (HD) exhibit motor impairments as well as cognitive and emotional deficits. So far, impairments in the ability to recognize emotional stimuli have mostly been investigated using facial expressions and emotional voices. Other important emotional signals are provided by the whole Body. To investigate the impact of motor deficits on Body recognition and the relation between motor disorders and emotion perception deficits, we tested recognition of emotional Body Language (instrumental, angry, fearful and sad) in 19 HD patients and their matched controls with a nonverbal whole Body expression matching task. Results indicate that HD patients are impaired in recognizing both instrumental and angry whole Body postures. Furthermore, the Body Language perception deficits are correlated with measures of motor deficit. Taken together, the results suggest a close relationship between emotion recognition (specifically anger) and motor abilities.

  • Seeing Fearful Body Language Overcomes Attentional Deficits in Patients with Neglect
    Journal of Cognitive Neuroscience, 2007
    Co-Authors: Marco Tamietto, Giuliano Geminiani, Rosanna Genero, Beatrice De Gelder
    Abstract:

    Survival depends to some extent on the ability to detect salient signals and prepare an appropriate response even when attention is engaged elsewhere. Fearful Body Language is a salient signal of imminent danger, easily observable from a distance and indicating to the observer which adaptive action to prepare for. Here we investigated for the first time whether fearful Body Language modulates the spatial distribution of attention and enhances visual awareness in neurological patients with severe attentional disorders. Patients with visual extinction and hemispatial neglect following right parietal injury have a rightward attentional bias accompanied by loss of awareness for contralesional left stimuli, especially when competing stimuli appear to the right. Three such patients were tested with pictures of fearful, happy, and neutral bodily expressions briefly presented either unilaterally in the left or right visual field, or to both fields simultaneously. On bilateral trials, unattended and task-irrelevant fearful bodily expressions modulated attentional selection and visual awareness. Fearful bodily expressions presented in the contralesional unattended visual field simultaneously with neutral bodies in the ipsilesional field were detected more often than left-side neutral or happy bodies. This demonstrates that despite pathological inattention and parietal damage, emotion and action-related information in fearful Body Language may be extracted automatically, biasing attentional selection and visual awareness. Our findings open new perspectives on the role of bodily expressions in attentional selection and suggest that a neural network in intact fronto-limbic and visual areas may still mediate reorienting of attention and preparation for action upon perceiving fear in others.

  • Non-conscious recognition of emotional Body Language
    Neuroreport, 2006
    Co-Authors: Beatrice De Gelder, Nouchine Hadjikhani
    Abstract:

    Patients with cortical blindness can reliably perceive some facial expressions even if they are unaware of their percept. We examined whether emotional Body Language may also be recognized in the absence of the primary visual cortex and without conscious stimulus perception. We presented emotional and neutral Body images in the blind field of a patient with unilateral striate cortex damage. Using functional magnetic resonance imaging, we measured activation following presentation to the blind hemifield of whole Body images (happy, neutral) with the face blurred. Unseen happy Body images selectively activated area MT and the pulvinar nucleus of the thalamus, while unseen instrumental neutral Body images activated the premotor cortex. Our results show that in the absence of the striate cortex, implicit bodily emotion perception may be possible. NeuroReport 17:583-586. © 2006 Lippincott Williams & Wilkins.

  • Towards the neurobiology of emotional Body Language
    Nature Reviews Neuroscience, 2006
    Co-Authors: Beatrice De Gelder
    Abstract:

    People's faces show fear in many different circumstances. However, when people are terrified, as well as showing emotion, they run for cover. When we see a bodily expression of emotion, we immediately know what specific action is associated with a particular emotion, leaving little need for interpretation of the signal, as is the case for facial expressions. Research on emotional Body Language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-Body signals are automatically perceived and understood, and their role in emotional communication and decision-making.

Goldie Nejat - One of the best experts on this subject based on the ideXlab platform.

  • Recognizing emotional Body Language displayed by a human-like social robot
    International Journal of Social Robotics, 2014
    Co-Authors: Derek Mccoll, Goldie Nejat
    Abstract:

    Natural social human–robot interactions (HRIs) require that robots have the ability to perceive and identify complex human social behaviors and, in turn, be able to also display their own behaviors using similar communication modes. Recently, it has been found that Body Language plays an important role in conveying information about changes in human emotions during human–human interactions. Our work focuses on extending this concept to robotic affective communication during social HRI. Namely, in this paper, we explore the design of emotional Body Language for our human-like social robot, Brian 2.0. We develop emotional Body Language for the robot using a variety of Body postures and movements identified in human emotion research. To date, only a handful of researchers have focused on the use of robotic Body Language to display emotions, with a significant emphasis being on the display of emotions through dance. Such emotional dance can be effective for small robots with large workspaces; however, it is not as appropriate for life-sized robots such as Brian 2.0 engaging in one-on-one interpersonal social interactions with a person. Experiments are presented to evaluate the feasibility of the robot’s emotional Body Language based on human recognition rates. Furthermore, a unique comparison study is presented to investigate the perception of human Body Language features displayed by the robot with respect to the same Body Language features displayed by a human actor.
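
    As an illustration of the posture-based design idea described above, the following minimal Python sketch maps emotion labels to static upper-Body posture targets. The joint names, angle values and Posture type are hypothetical assumptions for illustration only; they are not the actual Brian 2.0 control interface or the authors' posture set.

      # Hypothetical mapping from emotion labels to static upper-body postures.
      # Joint names and angle values are illustrative assumptions only.
      from dataclasses import dataclass

      @dataclass
      class Posture:
          head_pitch: float     # degrees; positive looks up, negative looks down
          torso_lean: float     # degrees; positive leans forward, negative leans back
          arm_elevation: float  # degrees of shoulder elevation for both arms

      # Loosely inspired by human emotion research (e.g. sadness -> slumped,
      # closed posture; happiness -> open, raised posture); values are made up.
      EMOTION_POSTURES = {
          "happiness": Posture(head_pitch=10.0, torso_lean=5.0, arm_elevation=40.0),
          "sadness":   Posture(head_pitch=-20.0, torso_lean=-10.0, arm_elevation=5.0),
          "anger":     Posture(head_pitch=-5.0, torso_lean=15.0, arm_elevation=25.0),
          "fear":      Posture(head_pitch=-10.0, torso_lean=-15.0, arm_elevation=10.0),
      }

      def posture_for(emotion: str) -> Posture:
          """Return the target posture for an emotion label (neutral if unknown)."""
          return EMOTION_POSTURES.get(emotion, Posture(0.0, 0.0, 0.0))

      if __name__ == "__main__":
          print(posture_for("sadness"))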

  • Determining the affective Body Language of older adults during socially assistive HRI
    2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014
    Co-Authors: Derek Mccoll, Goldie Nejat
    Abstract:

    Our research focuses on the development of a socially assistive robot to provide cognitive and social stimulation during meal-time scenarios in order to promote proper nutrition amongst the elderly. In this paper, we present the design of a novel automated affect recognition and classification system that will allow the robot to interpret natural displays of affective human Body Language during such one-on-one assistive scenarios. Namely, we identify appropriate Body Language features and learning-based classifiers that can be utilized for accurate affect estimation. A robot can then utilize this information in order to determine its own appropriate responsive behaviors to keep people engaged in this crucial activity. One-on-one assistive meal-time experiments were conducted with the robot Brian 2.1 and elderly participants at a long-term care facility. The results showed the potential of utilizing the automated affect recognition and classification system to identify and classify natural affective Body Language features into valence and arousal values using learning-based classifiers. The elderly users displayed a number of affective states, further motivating the use of the affect estimation system.
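
    A minimal sketch of the learning-based valence/arousal estimation idea follows, assuming posture and motion feature vectors have already been extracted from 3D Body data. The synthetic data, the six-dimensional feature vector and the random-forest classifiers are illustrative assumptions, not the authors' actual pipeline.

      # Sketch: learning-based classification of Body Language features into
      # valence and arousal categories. Data and classifier choice are assumed.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 6))            # 6 hypothetical posture/motion features
      valence = rng.integers(0, 3, size=200)   # e.g. negative / neutral / positive
      arousal = rng.integers(0, 3, size=200)   # e.g. low / medium / high

      X_tr, X_te, v_tr, v_te, a_tr, a_te = train_test_split(
          X, valence, arousal, test_size=0.25, random_state=0)

      valence_clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, v_tr)
      arousal_clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, a_tr)

      print("valence accuracy:", valence_clf.score(X_te, v_te))
      print("arousal accuracy:", arousal_clf.score(X_te, a_te))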

  • Affect detection from Body Language during social HRI
    Proceedings - IEEE International Workshop on Robot and Human Interactive Communication, 2012
    Co-Authors: Derek Mccoll, Goldie Nejat
    Abstract:

    In order for robots to effectively engage a person in bi-directional social human-robot interaction (HRI), they need to be able to perceive and respond appropriately to a person's affective state. It has been shown that Body Language is essential in effectively communicating human affect. In this paper, we present an automated real-time Body Language recognition and classification system, utilizing the Microsoft® Kinect™ sensor, that determines a person's affect in terms of their accessibility (i.e., openness and rapport) towards a robot during natural one-on-one interactions. Social HRI experiments are presented with our human-like robot Brian 2.0 and a comparison study between our proposed system and one developed with the Kinect™ Body pose estimation algorithm verifies the performance of our affect classification system in HRI scenarios.
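
    The sketch below illustrates, under stated assumptions, how accessibility (openness and rapport) toward the robot could be estimated from tracked skeleton joints such as those produced by a Kinect-style pose tracker. The joint names, the two hand-crafted features and the threshold rule are hypothetical and are not the authors' actual classification system.

      # Hedged sketch: estimating "accessibility" from 3D skeleton joints.
      # Joint names, features and thresholds are illustrative assumptions.
      import numpy as np

      def accessibility_features(joints: dict) -> np.ndarray:
          """joints maps names like 'left_hand' to 3D positions (metres, camera frame)."""
          shoulder_width = np.linalg.norm(joints["left_shoulder"] - joints["right_shoulder"])
          hand_spread = np.linalg.norm(joints["left_hand"] - joints["right_hand"])
          arm_openness = hand_spread / shoulder_width               # > 1 when arms are open
          trunk_lean = joints["head"][2] - joints["spine_base"][2]  # < 0 when leaning toward camera
          return np.array([arm_openness, trunk_lean])

      def accessibility_level(joints: dict) -> str:
          openness, lean = accessibility_features(joints)
          return "high" if openness > 1.2 and lean < 0.05 else "low"

      example = {
          "left_shoulder": np.array([-0.2, 1.4, 2.0]),
          "right_shoulder": np.array([0.2, 1.4, 2.0]),
          "left_hand": np.array([-0.5, 1.0, 1.9]),
          "right_hand": np.array([0.5, 1.0, 1.9]),
          "head": np.array([0.0, 1.6, 1.95]),
          "spine_base": np.array([0.0, 0.9, 2.0]),
      }
      print(accessibility_level(example))  # -> "high" for this open, forward-leaning pose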

Shrikanth Narayanan - One of the best experts on this subject based on the ideXlab platform.

  • Analysis and Predictive Modeling of Body Language Behavior in Dyadic Interactions From Multimodal Interlocutor Cues
    IEEE Transactions on Multimedia, 2014
    Co-Authors: Zhaojun Yang, Angeliki Metallinou, Shrikanth Narayanan
    Abstract:

    During dyadic interactions, participants adjust their behavior and give feedback continuously in response to the behavior of their interlocutors and the interaction context. In this paper, we study how a participant in a dyadic interaction adapts his/her Body Language to the behavior of the interlocutor, given the interaction goals and context. We apply a variety of psychology-inspired Body Language features to describe Body motion and posture. We first examine the coordination between the dyad's behavior for two interaction stances: friendly and conflictive. The analysis empirically reveals the dyad's behavior coordination and helps identify informative interlocutor features with respect to the participant's target Body Language features. The coordination patterns between the dyad's behavior are found to depend on the interaction stances assumed. We apply a Gaussian-Mixture-Model-based (GMM) statistical mapping in combination with a Fisher kernel framework for automatically predicting the Body Language of an interacting participant from the speech and gesture behavior of an interlocutor. The experimental results show that the Fisher kernel-based approach outperforms both a method using only the GMM-based mapping and a support vector regression baseline, in terms of correlation coefficient and RMSE. These results suggest a significant level of predictability of Body Language behavior from interlocutor cues.
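
    A hedged sketch of the prediction pipeline named above: fit a GMM on interlocutor cue vectors, encode each frame with a simplified Fisher-vector-style representation (posterior-weighted deviations from the component means), regress a target Body Language feature, and score with correlation coefficient and RMSE. The synthetic data, the simplified encoding and the ridge regressor are assumptions for illustration, not the authors' exact formulation.

      # Sketch: GMM fit + simplified Fisher-style encoding + regression, scored
      # with RMSE and correlation. Data and regressor are illustrative only.
      import numpy as np
      from sklearn.mixture import GaussianMixture
      from sklearn.linear_model import Ridge
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(1)
      cues = rng.normal(size=(500, 4))                 # interlocutor speech/gesture cues
      target = cues @ np.array([0.5, -0.3, 0.2, 0.1]) + 0.1 * rng.normal(size=500)

      gmm = GaussianMixture(n_components=3, random_state=1).fit(cues)

      def fisher_like(x: np.ndarray) -> np.ndarray:
          """Concatenate posterior-weighted deviations from each component mean."""
          post = gmm.predict_proba(x)                           # (N, K)
          devs = x[:, None, :] - gmm.means_[None, :, :]         # (N, K, D)
          return (post[:, :, None] * devs).reshape(len(x), -1)  # (N, K*D)

      train, test = slice(0, 400), slice(400, 500)
      reg = Ridge(alpha=1.0).fit(fisher_like(cues[train]), target[train])
      pred = reg.predict(fisher_like(cues[test]))

      rmse = np.sqrt(mean_squared_error(target[test], pred))
      corr = np.corrcoef(target[test], pred)[0, 1]
      print(f"RMSE={rmse:.3f}  correlation={corr:.3f}")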

  • Toward Body Language generation in dyadic interaction settings from interlocutor multimodal cues
    2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013
    Co-Authors: Zhaojun Yang, Angeliki Metallinou, Shrikanth Narayanan
    Abstract:

    During dyadic interactions, participants influence each other's verbal and nonverbal behaviors. In this paper, we examine the coordination between a dyad's Body Language behavior, such as Body motion, posture and relative orientation, given the participants' communication goals, e.g., friendly or conflictive, in improvised interactions. We further describe a Gaussian Mixture Model (GMM) based statistical methodology for automatically generating Body Language of a listener from speech and gesture cues of a speaker. The experimental results show that automatically generated Body Language trajectories generally follow the trends of observed trajectories, especially for velocities of Body and arms, and that the use of speech information improves prediction performance. These results suggest that there is a significant level of predictability of Body Language in the examined goal-driven improvisations, which could be exploited for interaction-driven and goal-driven Body Language generation.
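
    The sketch below illustrates GMM-based mapping in its simplest one-dimensional form, under stated assumptions: fit a joint GMM over (speaker cue, listener Body Language feature) pairs, then generate the listener feature as the posterior-weighted conditional mean given the speaker cue. The toy data and the 1-D setup are illustrative only, not the authors' full multimodal system.

      # Sketch: 1-D GMM-based mapping from a speaker cue to a listener feature.
      # Toy data; the real method uses multidimensional speech and gesture cues.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      speaker = rng.uniform(-2, 2, size=1000)
      listener = np.tanh(speaker) + 0.1 * rng.normal(size=1000)   # toy coordination pattern
      joint = np.column_stack([speaker, listener])

      gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=2).fit(joint)

      def generate_listener(x: float) -> float:
          """Conditional mean E[listener | speaker = x] under the joint GMM."""
          means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
          # Component responsibilities for the observed speaker value
          # (the shared 1/sqrt(2*pi) factor cancels in the normalization).
          px = np.array([w[k] * np.exp(-0.5 * (x - means[k, 0]) ** 2 / covs[k, 0, 0])
                         / np.sqrt(covs[k, 0, 0]) for k in range(len(w))])
          resp = px / px.sum()
          # Per-component conditional mean of the listener dimension
          cond = [means[k, 1] + covs[k, 1, 0] / covs[k, 0, 0] * (x - means[k, 0])
                  for k in range(len(w))]
          return float(np.dot(resp, cond))

      print(generate_listener(1.0))  # roughly tanh(1.0) ~ 0.76 for this toy data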

Derek Mccoll - One of the best experts on this subject based on the ideXlab platform.

  • Recognizing emotional Body Language displayed by a human-like social robot
    International Journal of Social Robotics, 2014
    Co-Authors: Derek Mccoll, Goldie Nejat
    Abstract:

    Natural social human–robot interactions (HRIs) require that robots have the ability to perceive and identify complex human social behaviors and, in turn, be able to also display their own behaviors using similar communication modes. Recently, it has been found that Body Language plays an important role in conveying information about changes in human emotions during human–human interactions. Our work focuses on extending this concept to robotic affective communication during social HRI. Namely, in this paper, we explore the design of emotional Body Language for our human-like social robot, Brian 2.0. We develop emotional Body Language for the robot using a variety of Body postures and movements identified in human emotion research. To date, only a handful of researchers have focused on the use of robotic Body Language to display emotions, with a significant emphasis being on the display of emotions through dance. Such emotional dance can be effective for small robots with large workspaces; however, it is not as appropriate for life-sized robots such as Brian 2.0 engaging in one-on-one interpersonal social interactions with a person. Experiments are presented to evaluate the feasibility of the robot’s emotional Body Language based on human recognition rates. Furthermore, a unique comparison study is presented to investigate the perception of human Body Language features displayed by the robot with respect to the same Body Language features displayed by a human actor.

  • Determining the affective Body Language of older adults during socially assistive HRI
    2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014
    Co-Authors: Derek Mccoll, Goldie Nejat
    Abstract:

    Our research focuses on the development of a socially assistive robot to provide cognitive and social stimulation during meal-time scenarios in order to promote proper nutrition amongst the elderly. In this paper, we present the design of a novel automated affect recognition and classification system that will allow the robot to interpret natural displays of affective human Body Language during such one-on-one assistive scenarios. Namely, we identify appropriate Body Language features and learning-based classifiers that can be utilized for accurate affect estimation. A robot can then utilize this information in order to determine its own appropriate responsive behaviors to keep people engaged in this crucial activity. One-on-one assistive meal-time experiments were conducted with the robot Brian 2.1 and elderly participants at a long-term care facility. The results showed the potential of utilizing the automated affect recognition and classification system to identify and classify natural affective Body Language features into valence and arousal values using learning-based classifiers. The elderly users displayed a number of affective states, further motivating the use of the affect estimation system.

  • Affect detection from Body Language during social HRI
    Proceedings - IEEE International Workshop on Robot and Human Interactive Communication, 2012
    Co-Authors: Derek Mccoll, Goldie Nejat
    Abstract:

    In order for robots to effectively engage a person in bi-directional social human-robot interaction (HRI), they need to be able to perceive and respond appropriately to a person's affective state. It has been shown that Body Language is essential in effectively communicating human affect. In this paper, we present an automated real-time Body Language recognition and classification system, utilizing the Microsoft® Kinect™ sensor, that determines a person's affect in terms of their accessibility (i.e., openness and rapport) towards a robot during natural one-on-one interactions. Social HRI experiments are presented with our human-like robot Brian 2.0 and a comparison study between our proposed system and one developed with the Kinect™ Body pose estimation algorithm verifies the performance of our affect classification system in HRI scenarios.

Zhaojun Yang - One of the best experts on this subject based on the ideXlab platform.

  • Analysis and Predictive Modeling of Body Language Behavior in Dyadic Interactions From Multimodal Interlocutor Cues
    IEEE Transactions on Multimedia, 2014
    Co-Authors: Zhaojun Yang, Angeliki Metallinou, Shrikanth Narayanan
    Abstract:

    During dyadic interactions, participants adjust their behavior and give feedback continuously in response to the behavior of their interlocutors and the interaction context. In this paper, we study how a participant in a dyadic interaction adapts his/her Body Language to the behavior of the interlocutor, given the interaction goals and context. We apply a variety of psychology-inspired Body Language features to describe Body motion and posture. We first examine the coordination between the dyad's behavior for two interaction stances: friendly and conflictive. The analysis empirically reveals the dyad's behavior coordination and helps identify informative interlocutor features with respect to the participant's target Body Language features. The coordination patterns between the dyad's behavior are found to depend on the interaction stances assumed. We apply a Gaussian-Mixture-Model-based (GMM) statistical mapping in combination with a Fisher kernel framework for automatically predicting the Body Language of an interacting participant from the speech and gesture behavior of an interlocutor. The experimental results show that the Fisher kernel-based approach outperforms both a method using only the GMM-based mapping and a support vector regression baseline, in terms of correlation coefficient and RMSE. These results suggest a significant level of predictability of Body Language behavior from interlocutor cues.

  • Toward Body Language generation in dyadic interaction settings from interlocutor multimodal cues
    2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013
    Co-Authors: Zhaojun Yang, Angeliki Metallinou, Shrikanth Narayanan
    Abstract:

    During dyadic interactions, participants influence each other's verbal and nonverbal behaviors. In this paper, we examine the coordination between a dyad's Body Language behavior, such as Body motion, posture and relative orientation, given the participants' communication goals, e.g., friendly or conflictive, in improvised interactions. We further describe a Gaussian Mixture Model (GMM) based statistical methodology for automatically generating Body Language of a listener from speech and gesture cues of a speaker. The experimental results show that automatically generated Body Language trajectories generally follow the trends of observed trajectories, especially for velocities of Body and arms, and that the use of speech information improves prediction performance. These results suggest that there is a significant level of predictability of Body Language in the examined goal-driven improvisations, which could be exploited for interaction-driven and goal-driven Body Language generation.