Journal Article PreJuSER-11869

Incongruence effects in crossmodal emotional integration


2011
Academic Press Orlando, Fla.

NeuroImage 54, 2257 - 2266 (2011) [10.1016/j.neuroimage.2010.10.047]


Please use a persistent id in citations: doi:10.1016/j.neuroimage.2010.10.047

Abstract: Emotions are often encountered in a multimodal fashion. Consequently, contextual framing by other modalities can alter the way an emotional facial expression is perceived and lead to emotional conflict. Whole-brain fMRI data were collected while 35 healthy subjects judged emotional expressions in faces and were concurrently exposed to emotional (scream, laughter) or neutral (yawning) sounds. The behavioral results showed that subjects rated fearful and neutral faces as more fearful when accompanied by screams than by yawns (and, for fearful faces, than by laughs). Moreover, the imaging data revealed that incongruence of emotional valence between faces and sounds led to increased activation in the middle cingulate cortex, the right superior frontal cortex, the right supplementary motor area, and the right temporoparietal junction. Contrary to expectations, no incongruence effects were found in the amygdala. Further analyses revealed that, independent of emotional valence congruency, the left amygdala was consistently activated when the information from both modalities was emotional. If the stimulus was neutral in one modality and emotional in the other, activation in the left amygdala was significantly attenuated. These results indicate that incongruence of emotional valence in audiovisual integration activates a cingulate-fronto-parietal network involved in conflict monitoring and resolution. Furthermore, in audiovisual pairings, amygdala responses seem to signal the absence of any neutral feature rather than only the presence of an emotionally charged one.

Keyword(s): Acoustic Stimulation (MeSH) ; Adult (MeSH) ; Amygdala: physiology (MeSH) ; Cerebral Cortex: physiology (MeSH) ; Data Interpretation, Statistical (MeSH) ; Depression: psychology (MeSH) ; Emotions: physiology (MeSH) ; Facial Expression (MeSH) ; Female (MeSH) ; Humans (MeSH) ; Image Processing, Computer-Assisted (MeSH) ; Laughter (MeSH) ; Linear Models (MeSH) ; Magnetic Resonance Imaging (MeSH) ; Male (MeSH) ; Photic Stimulation (MeSH) ; Prefrontal Cortex: physiology (MeSH) ; Psychiatric Status Rating Scales (MeSH) ; Social Perception (MeSH) ; Yawning (MeSH) ; fMRI (auto) ; Emotional conflict (auto) ; Incongruence (auto) ; Amygdala (auto) ; Audiovisual (auto)


Note: This study was supported by the Deutsche Forschungsgemeinschaft (DFG, IRTG 1328), by the Human Brain Project (R01-MH074457-01A1), and the Helmholtz Initiative on systems biology (The Human Brain Model).

Contributing Institute(s):
  1. Molekulare Organisation des Gehirns (INM-2)
Research Program(s):
  1. Funktion und Dysfunktion des Nervensystems (FUEK409) (FUEK409)
  2. 89571 - Connectivity and Activity (POF2-89571) (POF2-89571)

Appears in the scientific report 2011

The record appears in these collections:
Document types > Articles > Journal Article
Institute Collections > INM > INM-2
Workflow collections > Public records
Publications database

 Record created 2012-11-13, last modified 2021-01-29


