Interaction Task


The Experts below are selected from a list of 175,887 Experts worldwide, ranked by the ideXlab platform

Sylvie Granon - One of the best experts on this subject based on the ideXlab platform.

  • Effects of water restriction on social behavior and 5-HT neurons density in the dorsal and median raphe nuclei in mice
    Behavioural Brain Research, 2020
    Co-Authors: Houari Boukersi, Anne Nosjean, Alexis Faure, Nemcha Lebaili, Nathalie Samson, Sylvie Granon
    Abstract:

    We explored the hypothesis that temporary chronic water restriction in mice affects social behavior via its action on the density of 5-HT neurons in the dorsal and median raphe nuclei (DRN and MRN). To that end, we submitted adult C57BL/6J mice to mild, controlled temporary dehydration, i.e., 6 h of water access every 48 h for 15 days. We investigated their social behavior in a social Interaction Task known to allow free and reciprocal social contact. Results showed that temporary dehydration significantly increased the time spent in social contact and social dominance. It also increased 5-HT neuron density within both the DRN and MRN, and the behavioral and neuronal plasticity were positively correlated. Our findings suggest that the disturbance in 5-HT neurotransmission caused by temporary dehydration stress unbalances choice processes of animals in a social context.

  • Social context increases ultrasonic vocalizations during restraint in adult mice
    Animal Cognition, 2020
    Co-Authors: E. Lefebvre, Sylvie Granon, Frédéric Chauveau
    Abstract:

    Adult mice emit many ultrasonic vocalizations (USVs) during social Interaction Tasks, but only a few studies have reported USVs in stressed adult mice. Our aim was to study which experimental conditions favor USV emission during behaviors associated with different emotional states. As USVs likely mediate social communication, we hypothesized that temporary social isolation followed by exposure to a novel social congener would promote USV emission. USVs were recorded in three different behavioral paradigms: restraint, free moving in a new environment, and a social Interaction Task. We compared USV emission, with or without the presence of a social congener, in animals socially isolated for different periods (0, 6 or 21 days). Social isolation decreased the number of USVs emitted during free moving, whereas it increased the number emitted during restraint. During the social Interaction Task, animals produced high-frequency USVs (median: 72.6 kHz, 25-75% range: 67.6-78.2 kHz), especially when the social partner was active and social motivation was high. During restraint, the presence of a social congener increased the call rate of low-frequency USVs (median: 52.4 kHz, 25-75% range: 44.8-56.5 kHz). USV frequencies followed two unimodal distributions that distinguished low-frequency USVs (≤ 60 kHz), mainly emitted during the free-moving (90.9% of total USVs) and restraint (93.1%) conditions, from high-frequency USVs (> 60 kHz), mainly emitted during the social Interaction Task (85.1% of total USVs). The present study confirms that USV call rate and frequency depend on behavioral state, and provides evidence that the presence of a congener promotes ultrasonic vocalizations in restrained adult mice.

  • Social behaviors and acoustic vocalizations in different strains of mice
    Behavioural Brain Research, 2017
    Co-Authors: Elsa Pittaras, Arnaud Cressant, Anne Nosjean, Sylvie Granon, Jonathan Chabout, Alexis Faure
    Abstract:

    Proposing a framework for the study of core functions is valuable for understanding how they are altered in the many mental disorders involving prefrontal dysfunction, for understanding genetic influences, and for testing therapeutic compounds. Social and communication disabilities are reported in several major psychiatric disorders, and social communication disorders can also occur independently. Being able to study social communication, involving Interactions and associated acoustic vocalizations, in animal models is thus important. All rodents display extensive social behaviors, including Interactions and acoustic vocalizations. It is therefore important to pinpoint potential genetic strain differences, and similarities, in social behavior and vocalization. One approach is to compare different mouse strains, which may be useful in choosing which strains are most suitable for modeling psychiatric disorders in which social and communication deficits are core symptoms. We compared social behavior and ultrasonic acoustic vocalization profiles in males of four mouse strains (129S2/Sv, C57BL/6J, DBA/2, and CD-1) using a social Interaction Task that we previously showed to rely on prefrontal network activity. Our social Interaction Task promotes a high level of ultrasonic vocalization, provides both social and acoustic parameters, and further allows other measures of social behavior. The duration of social contact, dominance, and aggressiveness varied with the mouse strain. Only C57BL/6J mice showed no attacks, their social contact being highly affiliative, whereas the other strains emitted aggressive attacks. C57BL/6J mice also exhibited a significantly higher rate of ultrasonic vocalizations (USVs), especially during social Interaction.

  • Acute stress in adulthood impoverishes social choices and triggers aggressiveness in preclinical models
    Frontiers in Behavioral Neuroscience, 2015
    Co-Authors: Anne Nosjean, Arnaud Cressant, Frédéric Chauveau, Fabrice De Chaumont, Jean-christophe Olivo-marin, Sylvie Granon
    Abstract:

    Adult C57BL/6J mice are known to exhibit a high level of social flexibility, whereas mice lacking the β2 subunit of nicotinic receptors (β2-/- mice) show social rigidity. We asked what the consequences of an acute restraint stress (45 min) would be on social Interactions in adult mice of both genotypes, and hence what the contribution of neuronal nicotinic receptors is in this process. We therefore dissected the social Interaction complexity of stressed and non-stressed dyads of mice in a social Interaction Task. We also measured plasma corticosterone levels in our experimental conditions. We showed that a single stress exposure in adulthood reduced and disorganized social Interaction complexity in both C57BL/6J and β2-/- mice. These stress-induced maladaptive social Interactions involved alterations of distinct social categories and strategies in each genotype, suggesting a dissociable impact of stress depending on the functioning of the cholinergic nicotinic system. In both genotypes, social behaviors under stress were coupled with aggressive reactions and no changes in plasma corticosterone. Thus, aggressiveness appeared to be a general response independent of nicotinic function. We demonstrate here that a single stress exposure in adulthood is sufficient to impoverish social Interactions: stress impaired social flexibility in C57BL/6J mice, whereas it reinforced behavioral rigidity in β2-/- mice.

  • Nonaggressive and adapted social cognition is controlled by the interplay between noradrenergic and nicotinic mechanisms in the prefrontal cortex.
    FASEB Journal, 2013
    Co-Authors: Renata Dos Santos Coura, Arnaud Cressant, Fabrice De Chaumont, Jean-christophe Olivo-marin, Jing Xia, Yann Pelloux, Jeffrey W. Dalley, Sylvie Granon
    Abstract:

    Social animals establish flexible behaviors and integrated decision-making processes to adapt to social environments. Such behaviors are impaired in all major neuropsychiatric disorders and depend on the prefrontal cortex (PFC). We previously showed that nicotinic acetylcholine receptors (nAChRs) and norepinephrine (NE) in the PFC are necessary for mice to show adapted social cognition. Here, we investigated how the cholinergic and NE systems converge within the PFC to modulate social behavior. We used a social Interaction Task (SIT) in C57BL/6 mice and mice lacking β2*nAChRs (β2(-/-) mice), making use of dedicated software to analyze >20 social sequences and pinpoint social decisions. We performed specific PFC NE depletions before the SIT and measured monoamine and acetylcholine (ACh) levels in limbic corticostriatal circuitry. After PFC-NE depletion, C57BL/6 mice exhibited impoverished and more rigid social behavior and were 6-fold more aggressive than sham-lesioned animals, whereas β2(-/-) mice showed unimpaired social behavior. Our biochemical measures suggest a critical involvement of dopamine (DA) in the SIT. In addition, we show that the balance between basal levels of monoamines and of ACh modulates aggressiveness, and that this modulation requires functional β2*nAChRs. These findings demonstrate the critical interplay between prefrontal NE and nAChRs in the development of adapted and nonaggressive social cognition.

Andrea Serino - One of the best experts on this subject based on the ideXlab platform.

  • Audio-visual sensory deprivation degrades visuo-tactile peri-personal space
    Consciousness and Cognition, 2018
    Co-Authors: Jeanpaul Noel, Isabella Pasqualini, Andrea Serino, Hervé Lissek, Olaf Blanke, Hyeongdong Park, Mark T Wallace
    Abstract:

    Self-perception is scaffolded upon the integration of multisensory cues on the body, the space surrounding the body (i.e., the peri-personal space; PPS), and from within the body. We asked whether reducing the information available from external space would change PPS, interoceptive accuracy, and self-experience. Twenty participants were exposed to 15 min of audio-visual deprivation and performed: (i) a visuo-tactile Interaction Task measuring their PPS; (ii) a heartbeat perception Task measuring interoceptive accuracy; and (iii) a series of questionnaires related to self-perception and mental illness. These Tasks were carried out in two conditions: while exposed to a standard sensory environment and under a condition of audio-visual deprivation. Results suggest that while PPS becomes ill-defined after audio-visual deprivation, interoceptive accuracy is unaltered at the group level, with some participants improving and some worsening in interoceptive accuracy. Interestingly, correlational individual-differences analyses revealed that changes in PPS after audio-visual deprivation were related to interoceptive accuracy and to self-reports of "unusual experiences" on an individual-subject basis. Taken together, the findings argue for a relationship between the malleability of PPS, interoceptive accuracy, and an inclination toward the aberrant ideation often associated with mental illness.

  • Amputation and prosthesis implantation shape body and peripersonal space representations.
    Scientific reports, 2013
    Co-Authors: Elisa Canzoneri, Marilena Marzolla, Gennaro Verni, Amedeo Amoresano, Andrea Serino
    Abstract:

    Little is known about whether and how multimodal representations of the body (BRs) and of the space around the body (peripersonal space, PPS) adapt to amputation and prosthesis implantation. To investigate this issue, we tested BRs in a group of upper-limb amputees by means of a tactile distance perception Task and PPS by means of an audio-tactile Interaction Task. Subjects performed the Tasks with stimulation either on the healthy limb or on the stump of the amputated limb, while wearing or not wearing their prosthesis. When patients performed the Tasks on the amputated limb without the prosthesis, the perceived length of the arm shrank, with a concurrent shift of the PPS boundaries towards the stump. Conversely, wearing the prosthesis increased the perceived length of the stump and extended the PPS boundaries so as to include the prosthetic hand, such that the prosthesis partially replaced the missing limb.

Jeanbernard Martens - One of the best experts on this subject based on the ideXlab platform.

  • Tangible user interfaces for 3D clipping plane Interaction with volumetric data: a case study
    International Conference on Multimodal Interfaces, 2005
    Co-Authors: Wen Qi, Jeanbernard Martens
    Abstract:

    Visualization via direct volume rendering is a potentially very powerful technique for exploring and interacting with large amounts of scientific data. However, the available two-dimensional (2D) interfaces make three-dimensional (3D) manipulation of such data very difficult. The resulting usability problems during Interaction in turn discourage the widespread use of volume rendering as a scientific tool. In this paper, we present a more in-depth investigation into one specific interface aspect, i.e., the positioning of a clipping plane within volume-rendered data. More specifically, we propose three different interface prototypes that have been realized with the help of wireless vision-based tracking. These three prototypes combine aspects of 2D graphical user interfaces with 3D tangible Interaction devices. They allow users to experience and compare different user interface strategies for performing the clipping plane Interaction Task, and they provide a basis for carrying out user evaluations in the near future.
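
    A minimal sketch of the core of such a clipping plane Interaction: with the plane given in point-normal form, rendering simply discards the volume samples on one side of it. The helper name clip_mask, the NumPy-based formulation, and all numeric values below are illustrative assumptions, not code from the paper.

    import numpy as np

    def clip_mask(samples, plane_point, plane_normal):
        # Boolean mask over sample positions: True = kept (positive side of the
        # clipping plane), False = discarded from the volume rendering.
        n = plane_normal / np.linalg.norm(plane_normal)
        signed_dist = (samples - plane_point) @ n  # signed distance to the plane
        return signed_dist >= 0.0

    # Illustrative use: clip a 32x32x32 grid of sample positions against a plane
    # whose pose would, in the prototypes above, be set by a tracked tangible prop.
    grid = np.stack(np.meshgrid(np.arange(32), np.arange(32), np.arange(32),
                                indexing="ij"), axis=-1).reshape(-1, 3).astype(float)
    keep = clip_mask(grid,
                     plane_point=np.array([16.0, 16.0, 16.0]),
                     plane_normal=np.array([0.0, 0.0, 1.0]))
    # Only samples with keep == True contribute to the final image.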

Abderrahmane Kheddar - One of the best experts on this subject based on the ideXlab platform.

  • Quadratic Programming for Multirobot and Task-Space Force Control
    IEEE Transactions on Robotics, 2019
    Co-Authors: Karim Bouyarmane, Kévin Chappellet, Joris Vaillant, Abderrahmane Kheddar
    Abstract:

    We have extended the Task-space multiobjective controllers that are written as quadratic programs (QPs) to handle multirobot systems as a single centralized control. The idea is to assemble all the robots' models and their Interaction Task constraints into a single QP formulation. By multirobot, we mean that whatever entities a given robot will interact with (solid or articulated systems; actuated, partially actuated, or not actuated at all; fixed-base or floating-base), we model them as clusters of robots, and the controller computes the state of each cluster as an overall system, together with their Interaction forces, in a physically consistent way. By doing this, the Task specification simplifies substantially. At the heart of the Interactions between the systems are the contact forces; methodologies are provided to achieve reliable force tracking with our multirobot QP controller. The approach is assessed by a large panel of experiments on real complex robotic platforms (a full-size humanoid, a dexterous robotic hand, and a fixed-base anthropomorphic arm) performing whole-body manipulation, dexterous manipulation, and robot-robot comanipulation of rigid floating objects and articulated mechanisms, such as doors, drawers, boxes, or even smaller mechanisms like a spring-loaded click pen.
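
    As a rough illustration of what such a centralized controller solves at each control step, a generic Task-space QP over the assembled multirobot system can be sketched as follows (the notation is a standard whole-body-control form, not the authors' exact formulation):

    \[
    \begin{aligned}
    \min_{\ddot{q},\,\tau,\,f}\quad & \sum_i w_i \,\bigl\lVert J_i \ddot{q} + \dot{J}_i \dot{q} - \ddot{x}_i^{\mathrm{des}} \bigr\rVert^2 \\
    \text{s.t.}\quad & M(q)\,\ddot{q} + c(q,\dot{q}) = S^{\top}\tau + J_c^{\top} f \quad \text{(assembled dynamics of all robots in the cluster)} \\
    & f \in \mathcal{K} \quad \text{(contact/Interaction forces constrained to friction cones)} \\
    & \tau_{\min} \le \tau \le \tau_{\max} \quad \text{(actuation limits)},
    \end{aligned}
    \]

    where q stacks the configurations of every robot in the cluster, the Tasks i encode the Interaction Task objectives and constraints, and f gathers the contact forces whose reliable tracking the paper addresses.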

  • Multi-robot and Task-space force control with quadratic programming
    2017
    Co-Authors: Karim Bouyarmane, Kévin Chappellet, Joris Vaillant, Abderrahmane Kheddar
    Abstract:

    We extend the Task-space multi-objective controllers that are written as quadratic programs (QPs) to handle multi-robot systems as a single centralized control. The idea is to assemble all the robots' models and their Interaction Task constraints into a single QP formulation. By multi-robot we mean that whatever entities a given robot will interact with (solid or articulated systems; actuated, partially actuated, or not actuated at all; fixed-base or floating-base), we model them as robots, and the controller computes the state of the overall system and their Interaction forces in a physically consistent way. By doing so, the Task specification simplifies substantially. At the heart of the Interactions between the systems are the contact forces: we provide methodologies to achieve reliable force tracking with our multi-robot QP controller. The approach is assessed with a large panel of experiments on real complex robotic platforms (a full-size humanoid, a dexterous robotic hand, and a fixed-base anthropomorphic arm), performing whole-body manipulation, dexterous manipulation, and robot-robot co-manipulation of rigid floating objects and articulated mechanisms such as doors, drawers, boxes, or even smaller mechanisms such as a spring-loaded click pen. The implementation code of the controller is made available as open source.

Wen Qi - One of the best experts on this subject based on the ideXlab platform.

  • Tangible user interfaces for 3D clipping plane Interaction with volumetric data: a case study
    International Conference on Multimodal Interfaces, 2005
    Co-Authors: Wen Qi, Jeanbernard Martens
    Abstract:

    Visualization via direct volume rendering is a potentially very powerful technique for exploring and interacting with large amounts of scientific data. However, the available two-dimensional (2D) interfaces make three-dimensional (3D) manipulation of such data very difficult. The resulting usability problems during Interaction in turn discourage the widespread use of volume rendering as a scientific tool. In this paper, we present a more in-depth investigation into one specific interface aspect, i.e., the positioning of a clipping plane within volume-rendered data. More specifically, we propose three different interface prototypes that have been realized with the help of wireless vision-based tracking. These three prototypes combine aspects of 2D graphical user interfaces with 3D tangible Interaction devices. They allow users to experience and compare different user interface strategies for performing the clipping plane Interaction Task, and they provide a basis for carrying out user evaluations in the near future.
