Emotion Expression

The Experts below are selected from a list of 360 Experts worldwide, ranked by the ideXlab platform.

Klaus R. Scherer - One of the best experts on this subject based on the ideXlab platform.

  • Path Models of Vocal Emotion Communication
    PLOS ONE, 2015
    Co-Authors: Tanja Bänziger, Georg Hosoya, Klaus R. Scherer
    Abstract:

    We propose to use a comprehensive path model of vocal Emotion communication, encompassing encoding, transmission, and decoding processes, to empirically model data sets on Emotion Expression and recognition. The utility of the approach is demonstrated for two data sets from two different cultures and languages, based on corpora of vocal Emotion enactment by professional actors and Emotion inference by naive listeners. Lens model equations, hierarchical regression, and multivariate path analysis are used to compare the relative contributions of objectively measured acoustic cues in the enacted Expressions and subjective voice cues as perceived by listeners to the variance in Emotion inference from vocal Expressions for four Emotion families (fear, anger, happiness, and sadness). While the results confirm the central role of arousal in vocal Emotion communication, the utility of applying an extended path modeling framework is demonstrated by the identification of unique combinations of distal cues and proximal percepts carrying information about specific Emotion families, independent of arousal. The statistical models generated show that more sophisticated acoustic parameters need to be developed to explain the distal underpinnings of subjective voice quality percepts that account for much of the variance in Emotion inference, in particular voice instability and roughness. The general approach advocated here, as well as the specific results, open up new research strategies for work in psychology (specifically Emotion and social perception research) and engineering and computer science (specifically research and development in the domain of affective computing, particularly on automatic Emotion detection and synthetic Emotion Expression in avatars).
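
    The lens-model logic described in this abstract can be illustrated with a brief sketch: regress listeners' Emotion inferences separately on objectively measured distal acoustic cues and on subjectively rated proximal voice percepts, then compare the explained variance. This is only a rough illustration of the analysis idea, not the authors' code; the file name and column names (e.g., f0_mean, perceived_roughness, inferred_anger) are hypothetical placeholders.

```python
# Rough sketch of a lens-model-style comparison (not the authors' analysis code).
# Column names and the CSV file are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm


def r_squared(df: pd.DataFrame, predictors: list[str], outcome: str) -> float:
    """Fit an ordinary least squares regression and return its R^2."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[outcome], X).fit().rsquared


df = pd.read_csv("vocal_portrayals.csv")  # one row per vocal portrayal (hypothetical)

distal_cues = ["f0_mean", "f0_sd", "intensity_mean", "speech_rate"]   # measured acoustics
proximal_percepts = ["perceived_pitch", "perceived_loudness", "perceived_roughness"]
outcome = "inferred_anger"  # mean listener inference for one Emotion family

print("R^2, distal acoustic cues:    ", r_squared(df, distal_cues, outcome))
print("R^2, proximal voice percepts: ", r_squared(df, proximal_percepts, outcome))
print("R^2, both (hierarchical step):", r_squared(df, distal_cues + proximal_percepts, outcome))
```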

  • The Body Action and Posture Coding System (BAP): Development and Reliability
    Journal of Nonverbal Behavior, 2012
    Co-Authors: Nele Dael, Marcello Mortillaro, Klaus R. Scherer
    Abstract:

    Several methods are available for coding body movement in nonverbal behavior research, but there is no consensus on a reliable coding system that can be used for the study of Emotion Expression. Adopting an integrative approach, we developed a new method, the body action and posture coding system, for the time-aligned micro description of body movement on an anatomical level (different articulations of body parts), a form level (direction and orientation of movement), and a functional level (communicative and self-regulatory functions). We applied the system to a new corpus of acted Emotion portrayals, examined its comprehensiveness and demonstrated intercoder reliability at three levels: (a) occurrence, (b) temporal precision, and (c) segmentation. We discuss issues for further validation and propose some research applications.
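
    As a minimal illustration of the occurrence-level reliability mentioned above, intercoder agreement for a binary "behavior present/absent" judgment can be summarized with Cohen's kappa. The sketch below uses toy coder data and is not the BAP authors' reliability procedure; temporal precision and segmentation agreement would require additional, time-aligned measures.

```python
# Toy illustration of occurrence-level intercoder agreement via Cohen's kappa.
# The 0/1 vectors are invented (1 = coder marked the behavior unit as present).
from sklearn.metrics import cohen_kappa_score

coder_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

print(f"Occurrence agreement (Cohen's kappa): {cohen_kappa_score(coder_a, coder_b):.2f}")
```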

  • Emotion Expression in Body Action and Posture
    Emotion, 2012
    Co-Authors: Nele Dael, Marcello Mortillaro, Klaus R. Scherer
    Abstract:

    Emotion communication research strongly focuses on the face and voice as expressive modalities, leaving the rest of the body relatively understudied. Contrary to the early assumption that body movement only indicates Emotional intensity, recent studies have shown that body movement and posture also convey Emotion-specific information. However, a deeper understanding of the underlying mechanisms is hampered by a lack of production studies informed by a theoretical framework. In this research we adopted the Body Action and Posture (BAP) coding system to examine the types and patterns of body movement that are employed by 10 professional actors to portray a set of 12 Emotions. We investigated to what extent these Expression patterns support explicit or implicit predictions from basic Emotion theory, bidimensional theory, and componential appraisal theory. The overall results showed partial support for the different theoretical approaches. They revealed that several patterns of body movement systematically occur in portrayals of specific Emotions, allowing Emotion differentiation. Although a few Emotions were prototypically expressed by one particular pattern, most Emotions were variably expressed by multiple patterns, many of which can be explained as reflecting functional components of Emotion such as modes of appraisal and action readiness. It is concluded that further work in this largely underdeveloped area should be guided by an appropriate theoretical framework to allow a more systematic design of experiments and clear hypothesis testing.

  • Introducing the Geneva Multimodal Expression Corpus for Experimental Research on Emotion Perception
    Emotion, 2012
    Co-Authors: Tanja Bänziger, Marcello Mortillaro, Klaus R. Scherer
    Abstract:

    Research on the perception of Emotional Expressions in faces and voices is exploding in psychology, the neurosciences, and affective computing. This article provides an overview of some of the major Emotion Expression (EE) corpora currently available for empirical research and introduces a new, dynamic, multimodal corpus of Emotion Expressions, the Geneva Multimodal Emotion Portrayals Core Set (GEMEP-CS). The design features of the corpus are outlined and justified, and detailed validation data for the core set selection are presented and discussed. Finally, an associated database with microcoded facial, vocal, and body action elements, as well as observer ratings, is introduced.

  • Beyond Arousal: Valence and Potency/Control Cues in the Vocal Expression of Emotion
    Journal of the Acoustical Society of America, 2010
    Co-Authors: Martijn Goudbeek, Klaus R. Scherer
    Abstract:

    The important role of arousal in determining vocal parameters in the Expression of Emotion is well established. There is less evidence for the contribution of Emotion dimensions such as valence and potency/control to vocal Emotion Expression. Here, an acoustic analysis of the newly developed Geneva Multimodal Emotional Portrayals corpus is presented to examine the role of dimensions other than arousal. This corpus contains twelve Emotions that systematically vary with respect to valence, arousal, and potency/control. The Emotions were portrayed by professional actors coached by a stage director. The extracted acoustic parameters were first compared with those obtained from a similar corpus [Banse and Scherer (1996). J. Pers. Soc. Psychol. 70, 614–636] and shown to largely replicate the earlier findings. Based on a principal component analysis, seven composite scores were calculated and were used to determine the relative contribution of the respective vocal parameters to the Emotional dimensions arousal,...
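
    A compact sketch of the analysis idea (principal components of standardized acoustic parameters used as composite predictors of an Emotion dimension) is given below. It is an assumption-laden illustration, not the published analysis: the file gemep_acoustics.csv, the ac_ column prefix, and the rated_arousal variable are invented for the example.

```python
# Sketch of the analysis idea: PCA composites of standardized acoustic parameters,
# then their contribution to a rated Emotion dimension. Names are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("gemep_acoustics.csv")                          # hypothetical file, one row per portrayal
acoustic_cols = [c for c in df.columns if c.startswith("ac_")]   # assumed naming scheme

z = StandardScaler().fit_transform(df[acoustic_cols])            # standardize acoustic parameters
pca = PCA(n_components=7)                                        # seven composite scores, as in the abstract
composites = pca.fit_transform(z)

reg = LinearRegression().fit(composites, df["rated_arousal"])
print("Explained variance per component:", pca.explained_variance_ratio_.round(2))
print("R^2 for rated arousal:", round(reg.score(composites, df["rated_arousal"]), 2))
```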

Tara M Chaplin - One of the best experts on this subject based on the ideXlab platform.

  • Gender Differences in Emotion Expression in Low-Income Adolescents Under Stress
    Journal of Nonverbal Behavior, 2016
    Co-Authors: Naaila Panjwani, Tara M Chaplin, Rajita Sinha, Linda C Mayes
    Abstract:

    Gender roles in mainstream US culture suggest that girls express more happiness, sadness, anxiety, and shame/embarrassment than boys, while boys express more anger and externalizing Emotions, such as contempt. However, gender roles and Emotion Expression may be different in low-income and ethnically diverse families, as children and parents are often faced with greater environmental stressors and may have different gender expectations. This study examined gender differences in Emotion Expression in low-income adolescents, an understudied population. One hundred and seventy-nine adolescents (aged 14–17) participated in the Trier Social Stress Test (TSST). Trained coders rated adolescents’ Expressions of happiness, sadness, anxiety, shame/embarrassment, anger, and contempt during the TSST using a micro-analytic coding system. Analyses showed that, consistent with gender roles, girls expressed higher levels of happiness and shame than boys; however, contrary to traditional gender roles, girls showed higher levels of contempt than boys. Also, in contrast to cultural stereotypes, there were no differences in anger between boys and girls. Findings suggest gender-role inconsistent displays of externalizing Emotions in low-income adolescents under acute stress, and may reflect different Emotion socialization experiences in this group.

  • Gender and Emotion Expression: A Developmental Contextual Perspective
    Emotion Review, 2015
    Co-Authors: Tara M Chaplin
    Abstract:

    Small but significant gender differences in Emotion Expressions have been reported for adults, with women showing greater Emotional expressivity, especially for positive Emotions and internalizing negative Emotions such as sadness. But when, developmentally, do these gender differences emerge? And what developmental and contextual factors influence their emergence? This article describes a developmental bio-psycho-social model of gender differences in Emotion Expression in childhood. Prior empirical research supporting the model, at least with mostly White middle-class U.S. samples of youth, is presented. Limitations to the extant literature and future directions for research on gender and child Emotion are suggested.

  • Gender Differences in Emotion Expression in Children: A Meta-Analytic Review
    Psychological Bulletin, 2013
    Co-Authors: Tara M Chaplin, Amelia Aldao
    Abstract:

    Emotion Expression is an important feature of healthy child development that has been found to show gender differences. However, there has been no empirical review of the literature on gender and facial, vocal, and behavioral Expressions of different types of Emotions in children. The present study constitutes a comprehensive meta-analytic review of gender differences and moderators of differences in Emotion Expression from infancy through adolescence. We analyzed 555 effect sizes from 166 studies with a total of 21,709 participants. Significant but very small gender differences were found overall, with girls showing more positive Emotions (g = –.08) and internalizing Emotions (e.g., sadness, anxiety, sympathy; g = –.10) than boys, and boys showing more externalizing Emotions (e.g., anger; g = .09) than girls. Notably, gender differences were moderated by age, interpersonal context, and task valence, underscoring the importance of contextual factors in gender differences. Gender differences in positive Emotions were more pronounced with increasing age, with girls showing more positive Emotions than boys in middle childhood (g = –.20) and adolescence (g = –.28). Boys showed more externalizing Emotions than girls at toddler/preschool age (g = .17) and middle childhood (g = .13) and fewer externalizing Emotions than girls in adolescence (g = –.27). Gender differences were less pronounced with parents and were more pronounced with unfamiliar adults (for positive Emotions) and with peers/when alone (for externalizing Emotions). Our findings of gender differences in Emotion Expression in specific contexts have important implications for gender differences in children’s healthy and maladaptive development.
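
    For readers unfamiliar with the effect-size metric reported above, the short sketch below shows how a Hedges' g standardized mean difference is computed from group summary statistics, using the same sign convention (negative values indicate girls scoring higher). The numbers are illustrative only and are not taken from the meta-analysis.

```python
# Illustrative computation of Hedges' g (numbers are not data from the meta-analysis).
import math


def hedges_g(mean_girls: float, sd_girls: float, n_girls: int,
             mean_boys: float, sd_boys: float, n_boys: int) -> float:
    """Standardized mean difference (boys minus girls) with small-sample correction,
    so negative values mean girls scored higher, as in the abstract."""
    pooled_sd = math.sqrt(((n_girls - 1) * sd_girls**2 + (n_boys - 1) * sd_boys**2)
                          / (n_girls + n_boys - 2))
    d = (mean_boys - mean_girls) / pooled_sd
    correction = 1 - 3 / (4 * (n_girls + n_boys) - 9)   # Hedges' small-sample bias correction
    return d * correction


# Toy example: girls slightly higher on a positive-Emotion measure -> small negative g
print(round(hedges_g(mean_girls=3.2, sd_girls=1.0, n_girls=60,
                     mean_boys=3.1, sd_boys=1.0, n_boys=60), 2))
```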

  • Gender Differences in Caregiver Emotion Socialization of Low-Income Toddlers
    New Directions for Child and Adolescent Development, 2010
    Co-Authors: Tara M Chaplin, Rajita Sinha, James Casey, Linda C Mayes
    Abstract:

    Studies have shown gender differences in children’s Emotion Expression as early as preschool age, with girls showing greater sadness and anxiety/fear than boys and boys showing greater anger/aggression than girls, at least for middle-class children (Brody, 1999; Cole, 1986). These patterns of Expression are consistent with gender roles in U.S. culture for females to be relationship-oriented and to show “softer” negative Emotions and for males to be assertive and to more freely show anger (Brody & Hall, 2000; Jordan, Surrey, & Kaplan, 1991; Zahn-Waxler, Cole, & Barrett, 1991). But how do girls and boys come to internalize gender roles and to express different patterns of Emotion? Emotional arousal and Emotion Expression have a basis in biology (Fox, 1994). However, boys’ and girls’ Emotions may also be influenced by messages from their environment, including from caregivers (also referred to throughout as “parents”). As discussed in Chapter One of this volume, previous studies of parental socialization of Emotion have shown gender differences, with girls receiving greater supportive responses for their sadness and anxiety and boys receiving greater support for their anger (e.g., Chaplin, Cole, & Zahn-Waxler, 2005; Fivush, 1989). Notably, these studies have examined Emotion socialization processes mainly in Caucasian, middle-income families. The present chapter will discuss gender and Emotion socialization in low-income families. It is important to understand Emotion socialization in these families, given that they encounter multiple chronic stressors that impact child Emotion and parent–child interactions. We will also describe potential consequences of gender differences in parental Emotion socialization for children (and, in particular, low-income children): gendered socialization may lead boys and girls to adopt different patterns of Emotion that may, in their extremes, contribute to risk for different types of psychopathology (Izard, 1972; Malatesta & Wilson, 1988). In this chapter we focus on caregivers’ responses to their children’s Emotions in low-income families, although “Emotion socialization” also includes other aspects of family life, such as parents’ own expressivity (Eisenberg, Cumberland, & Spinrad, 1998; Thompson & Meyer, 2007). Also, we focus on child gender differences, although differences between mothers and fathers in their socialization practices have also been found (see Kennedy Root and Denham, Chapter One) and are important to consider in low-income families.

  • Parental Socialization of Emotion Expression: Gender Differences and Relations to Child Adjustment
    Emotion, 2005
    Co-Authors: Tara M Chaplin, Pamela M Cole, Carolyn Zahn-Waxler
    Abstract:

    The present study examined gender differences in children's submissive and disharmonious Emotions and parental attention to these Emotions. Sixty children and their mothers and fathers participated when children were 4 and 6 years old. Children's Emotion Expression and parental responses during a game were coded. Girls expressed more submissive Emotion than boys. Fathers attended more to girls' submissive Emotion than to boys' at preschool age. Fathers attended more to boys' disharmonious Emotion than to girls' at early school age. Parental attention at preschool age predicted later submissive Expression level. Child disharmonious Emotion predicted later externalizing symptoms. Gender differences in these Emotions may occur as early as preschool age and may be subject to differential responding, particularly by fathers.

Richard J Porter - One of the best experts on this subject based on the ideXlab platform.

  • Processing of Facial Emotion Expression in Major Depression: A Review
    Australian and New Zealand Journal of Psychiatry, 2010
    Co-Authors: Cecilia Bourke, Katie M Douglas, Richard J Porter
    Abstract:

    Processing of facial Expressions of Emotion is central to human interaction, and has important effects on behaviour and affective state. A range of methods and paradigms have been used to investigate various aspects of abnormal processing of facial Expressions in major depression, including Emotion specific deficits in recognition accuracy, response biases and attentional biases. The aim of this review is to examine and interpret data from studies of facial Emotion processing in major depression, in the context of current knowledge about the neural correlates of facial Expression processing of primary Emotions. The review also discusses the methodologies used to examine facial Expression processing. Studies of facial Emotion processing and facial Emotion recognition were identified up to December 2009 utilizing MEDLINE and Web of Science. Although methodological variations complicate interpretation of findings, there is reasonably consistent evidence of a negative response bias towards sadness in individuals with major depression, so that positive (happy), neutral or ambiguous facial Expressions tend to be evaluated as more sad or less happy compared with healthy control groups. There is also evidence of increased vigilance and selective attention towards sad Expressions and away from happy Expressions, but less evidence of reduced general or Emotion-specific recognition accuracy. Data is complicated by the use of multiple paradigms and the heterogeneity of major depression. Future studies should address methodological problems, including variations in patient characteristics, testing paradigms and procedures, and statistical methods used to analyse findings.

Nobutsuna Endo - One of the best experts on this subject based on the ideXlab platform.

  • Impression Survey of the Emotion Expression Humanoid Robot with Mental Model Based Dynamic Emotions
    International Conference on Robotics and Automation, 2013
    Co-Authors: Tatsuhiro Kishi, Nobutsuna Endo, T Kojima, Matthieu Destephe, Takuya Otani, Lorenzo Jamone, Przemyslaw Kryczka, Gabriele Trovato, Kenji Hashimoto, Sarah Cosentino
    Abstract:

    This paper describes the implementation in a walking humanoid robot of a mental model, allowing the dynamical change of the Emotional state of the robot based on external stimuli; the Emotional state affects the robot’s decisions and behavior, and it is expressed with both facial and whole-body patterns. The mental model is applied to KOBIAN-R, a 65-DoF whole-body humanoid robot designed for human-robot interaction and Emotion Expression. To evaluate the importance of the proposed system in the framework of human-robot interaction and communication, we conducted a survey by showing videos of the robot behaviors to a group of 30 subjects. The results show that the integration of dynamical Emotion Expression and locomotion makes the humanoid robot more appealing to humans, as it is perceived as more “favorable” and “useful”, and less “robot-like”.
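
    The general idea of a mental model that dynamically updates an internal Emotional state from external stimuli and maps it to an Expression can be sketched as follows. This toy example, with assumed valence/arousal dimensions, a simple decay rule, and arbitrary thresholds, is meant only to illustrate the concept and does not reproduce the KOBIAN-R implementation.

```python
# Toy mental-model sketch: an Emotional state updated by external stimuli and
# mapped to an Expression label. Dimensions, decay, and thresholds are assumptions,
# not the KOBIAN-R implementation.
from dataclasses import dataclass


@dataclass
class EmotionalState:
    valence: float = 0.0   # negative .. positive
    arousal: float = 0.0   # calm .. excited

    def update(self, stim_valence: float, stim_arousal: float, decay: float = 0.9) -> None:
        """Decay toward neutral, then add the appraised stimulus contribution."""
        self.valence = self.valence * decay + stim_valence
        self.arousal = self.arousal * decay + stim_arousal

    def expression(self) -> str:
        """Map the continuous state to a discrete whole-body Expression label."""
        if self.arousal < 0.3:
            return "neutral"
        return "joy" if self.valence >= 0 else "anger"


state = EmotionalState()
for stim in [(0.5, 0.4), (0.6, 0.5), (-1.5, 0.7)]:   # a stream of appraised stimuli
    state.update(*stim)
    print(state.expression(), round(state.valence, 2), round(state.arousal, 2))
```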

  • Integration of Emotion Expression and Visual Tracking Locomotion Based on Vestibulo-Ocular Reflex
    Robot and Human Interactive Communication, 2010
    Co-Authors: Nobutsuna Endo, Kenji Hashimoto, Keita Endo, Takuya Kojima, Fumiya Iida, Atsuo Takanishi
    Abstract:

    Personal robots, anticipated to become popular in the future, are required to be active in joint work and community life with humans. These personal robots must recognize a changing environment and take appropriate actions, much as humans do. Visual tracking is a fundamental function from the viewpoint of environmental sensing and reflexive reaction to the environment. The authors developed a visual tracking motion algorithm using the upper body. Then, we integrated it with an online walking pattern generator and developed visual tracking biped locomotion. Finally, we conducted an experimental evaluation with Emotion Expression.
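
    A vestibulo-ocular-reflex-like compensation can be sketched as a head command that counter-rotates against the measured trunk motion so that gaze stays on the tracked target during walking. The gain, frames, and function below are illustrative assumptions, not the authors' controller.

```python
# Illustrative VOR-like compensation: counter-rotate the head against measured
# trunk motion so gaze stays on the target. Not the authors' controller.
def vor_head_command(target_yaw_world: float, trunk_yaw: float, gain: float = 1.0) -> float:
    """Head yaw relative to the trunk [rad] that keeps gaze on a target whose
    direction is given in the world frame; gain = 1.0 means full cancellation."""
    return gain * (target_yaw_world - trunk_yaw)


# Example: the trunk sways while walking, but the commanded gaze direction stays fixed.
for trunk_yaw in (0.00, 0.05, -0.05, 0.10):
    head = vor_head_command(target_yaw_world=0.2, trunk_yaw=trunk_yaw)
    print(f"trunk yaw {trunk_yaw:+.2f} rad -> head yaw command {head:+.2f} rad")
```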

  • Design of the Humanoid Robot KOBIAN: Preliminary Analysis of Facial and Whole-Body Emotion Expression Capabilities
    8th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2008), 2008
    Co-Authors: Massimiliano Zecca, Nobutsuna Endo, Kazuko Itoh, Shimpei Momoki, Atsuo Takanishi
    Abstract:

    Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, actively participating in joint work and community life with humans as partners and friends. In particular, these robots are expected to be fundamental for helping and assisting elderly and disabled people during their activities of daily living (ADLs). To achieve this, personal robots should be capable of human-like Emotion Expressions; in addition, human-like bipedal walking is the best solution for robots that must be active in the human living environment. Although several bipedal robots and several Emotional Expression robots have been developed in recent years, until now no robot has integrated all of these functions. Therefore we developed a new bipedal walking robot, named KOBIAN, which is also capable of expressing human-like Emotions. In this paper, we present the design and the preliminary evaluation of the new Emotional Expression head. The preliminary results showed that the Emotion expressed by the head alone cannot easily be understood by users. However, the presence of a full body clearly enhances the Emotion Expression capability of the robot, thus proving the effectiveness of the proposed approach.

  • Development of Whole-Body Emotion Expression Humanoid Robot
    Proceedings - IEEE International Conference on Robotics and Automation, 2008
    Co-Authors: Nobutsuna Endo, Kazuko Itoh, Yu Mizoguchi, Shimpei Momoki, Massimiliano Zecca, Minoru Saito, Atsuo Takanishi
    Abstract:

    Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, actively participating in joint work and community life with humans as partners and friends. The authors think that Emotion Expression by a robot is effective in joint activities of humans and robots. In addition, we also think that bipedal walking is necessary for robots that are active in the human living environment. However, there has been no robot with both of these functions, and it is not yet clear which functions are actually effective. Therefore we developed a new bipedal walking robot that is capable of expressing Emotions. In this paper, we present the design and the preliminary evaluation of the new head of the robot, which uses only a small number of degrees of freedom for facial Expression.

Linda C Mayes - One of the best experts on this subject based on the ideXlab platform.

  • Gender Differences in Emotion Expression in Low-Income Adolescents Under Stress
    Journal of Nonverbal Behavior, 2016
    Co-Authors: Naaila Panjwani, Tara M Chaplin, Rajita Sinha, Linda C Mayes
    Abstract:

    Gender roles in mainstream US culture suggest that girls express more happiness, sadness, anxiety, and shame/embarrassment than boys, while boys express more anger and externalizing Emotions, such as contempt. However, gender roles and Emotion Expression may be different in low-income and ethnically diverse families, as children and parents are often faced with greater environmental stressors and may have different gender expectations. This study examined gender differences in Emotion Expression in low-income adolescents, an understudied population. One hundred and seventy-nine adolescents (aged 14–17) participated in the Trier Social Stress Test (TSST). Trained coders rated adolescents’ Expressions of happiness, sadness, anxiety, shame/embarrassment, anger, and contempt during the TSST using a micro-analytic coding system. Analyses showed that, consistent with gender roles, girls expressed higher levels of happiness and shame than boys; however, contrary to traditional gender roles, girls showed higher levels of contempt than boys. Also, in contrast to cultural stereotypes, there were no differences in anger between boys and girls. Findings suggest gender-role inconsistent displays of externalizing Emotions in low-income adolescents under acute stress, and may reflect different Emotion socialization experiences in this group.

  • Gender Differences in Caregiver Emotion Socialization of Low-Income Toddlers
    New Directions for Child and Adolescent Development, 2010
    Co-Authors: Tara M Chaplin, Rajita Sinha, James Casey, Linda C Mayes
    Abstract:

    Studies have shown gender differences in children’s Emotion Expression as early as preschool age, with girls showing greater sadness and anxiety/fear than boys and boys showing greater anger/aggression than girls, at least for middle-class children (Brody, 1999; Cole, 1986). These patterns of Expression are consistent with gender roles in U.S. culture for females to be relationship-oriented and to show “softer” negative Emotions and for males to be assertive and to more freely show anger (Brody & Hall, 2000; Jordan, Surrey, & Kaplan, 1991; Zahn-Waxler, Cole, & Barrett, 1991). But how do girls and boys come to internalize gender roles and to express different patterns of Emotion? Emotional arousal and Emotion Expression have a basis in biology (Fox, 1994). However, boys’ and girls’ Emotions may also be influenced by messages from their environment, including from caregivers (also referred to throughout as “parents”). As discussed in Chapter One of this volume, previous studies of parental socialization of Emotion have shown gender differences, with girls receiving greater supportive responses for their sadness and anxiety and boys receiving greater support for their anger (e.g., Chaplin, Cole, & Zahn-Waxler, 2005; Fivush, 1989). Notably, these studies have examined Emotion socialization processes mainly in Caucasian, middle-income families. The present chapter will discuss gender and Emotion socialization in low-income families. It is important to understand Emotion socialization in these families, given that they encounter multiple chronic stressors that impact child Emotion and parent–child interactions. We will also describe potential consequences of gender differences in parental Emotion socialization for children (and, in particular, low-income children): gendered socialization may lead boys and girls to adopt different patterns of Emotion that may, in their extremes, contribute to risk for different types of psychopathology (Izard, 1972; Malatesta & Wilson, 1988). In this chapter we focus on caregivers’ responses to their children’s Emotions in low-income families, although “Emotion socialization” also includes other aspects of family life, such as parents’ own expressivity (Eisenberg, Cumberland, & Spinrad, 1998; Thompson & Meyer, 2007). Also, we focus on child gender differences, although differences between mothers and fathers in their socialization practices have also been found (see Kennedy Root and Denham, Chapter One) and are important to consider in low-income families.