000011869 001__ 11869
000011869 005__ 20210129210548.0
000011869 0247_ $$2pmid$$apmid:20974266
000011869 0247_ $$2DOI$$a10.1016/j.neuroimage.2010.10.047
000011869 0247_ $$2WOS$$aWOS:000286302000048
000011869 037__ $$aPreJuSER-11869
000011869 041__ $$aeng
000011869 082__ $$a610
000011869 084__ $$2WoS$$aNeurosciences
000011869 084__ $$2WoS$$aNeuroimaging
000011869 084__ $$2WoS$$aRadiology, Nuclear Medicine & Medical Imaging
000011869 1001_ $$0P:(DE-Juel1)131699$$aMüller, V.I.$$b0$$uFZJ
000011869 245__ $$aIncongruence effects in crossmodal emotional integration
000011869 260__ $$aOrlando, Fla.$$bAcademic Press$$c2011
000011869 300__ $$a2257 - 2266
000011869 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article
000011869 3367_ $$2DataCite$$aOutput Types/Journal article
000011869 3367_ $$00$$2EndNote$$aJournal Article
000011869 3367_ $$2BibTeX$$aARTICLE
000011869 3367_ $$2ORCID$$aJOURNAL_ARTICLE
000011869 3367_ $$2DRIVER$$aarticle
000011869 440_0 $$04545$$aNeuroImage$$v54$$x1053-8119$$y3
000011869 500__ $$aThis study was supported by the Deutsche Forschungsgemeinschaft (DFG, IRTG 1328), by the Human Brain Project (R01-MH074457-01A1), and by the Helmholtz Initiative on Systems Biology (The Human Brain Model).
000011869 520__ $$aEmotions are often encountered in a multimodal fashion. Consequently, contextual framing by other modalities can alter the way an emotional facial expression is perceived and lead to emotional conflict. Whole-brain fMRI data were collected while 35 healthy subjects judged emotional expressions in faces and were concurrently exposed to emotional (scream, laughter) or neutral (yawning) sounds. The behavioral results showed that subjects rated fearful and neutral faces as more fearful when they were accompanied by screams than by yawns (and, for fearful faces, by laughs). Moreover, the imaging data revealed that incongruence of emotional valence between faces and sounds led to increased activation in the middle cingulate cortex, the right superior frontal cortex, the right supplementary motor area, and the right temporoparietal junction. Contrary to expectations, no incongruence effects were found in the amygdala. Further analyses revealed that, independent of emotional valence congruency, the left amygdala was consistently activated when the information from both modalities was emotional. If the stimulus was neutral in one modality and emotional in the other, activation in the left amygdala was significantly attenuated. These results indicate that incongruence of emotional valence in audiovisual integration activates a cingulate-fronto-parietal network involved in conflict monitoring and resolution. Furthermore, in audiovisual pairings, amygdala responses seem to signal the absence of any neutral feature rather than only the presence of an emotionally charged one.
000011869 536__ $$0G:(DE-Juel1)FUEK409$$2G:(DE-HGF)$$aFunktion und Dysfunktion des Nervensystems (FUEK409)$$cFUEK409$$x0
000011869 536__ $$0G:(DE-HGF)POF2-89571$$a89571 - Connectivity and Activity (POF2-89571)$$cPOF2-89571$$fPOF II T$$x1
000011869 588__ $$aDataset connected to Web of Science, PubMed
000011869 65320 $$2Author$$afMRI
000011869 65320 $$2Author$$aEmotional conflict
000011869 65320 $$2Author$$aIncongruence
000011869 65320 $$2Author$$aAmygdala
000011869 65320 $$2Author$$aAudiovisual
000011869 650_2 $$2MeSH$$aAcoustic Stimulation
000011869 650_2 $$2MeSH$$aAdult
000011869 650_2 $$2MeSH$$aAmygdala: physiology
000011869 650_2 $$2MeSH$$aCerebral Cortex: physiology
000011869 650_2 $$2MeSH$$aData Interpretation, Statistical
000011869 650_2 $$2MeSH$$aDepression: psychology
000011869 650_2 $$2MeSH$$aEmotions: physiology
000011869 650_2 $$2MeSH$$aFacial Expression
000011869 650_2 $$2MeSH$$aFemale
000011869 650_2 $$2MeSH$$aHumans
000011869 650_2 $$2MeSH$$aImage Processing, Computer-Assisted
000011869 650_2 $$2MeSH$$aLaughter
000011869 650_2 $$2MeSH$$aLinear Models
000011869 650_2 $$2MeSH$$aMagnetic Resonance Imaging
000011869 650_2 $$2MeSH$$aMale
000011869 650_2 $$2MeSH$$aPhotic Stimulation
000011869 650_2 $$2MeSH$$aPrefrontal Cortex: physiology
000011869 650_2 $$2MeSH$$aPsychiatric Status Rating Scales
000011869 650_2 $$2MeSH$$aSocial Perception
000011869 650_2 $$2MeSH$$aYawning
000011869 650_7 $$2WoSType$$aJ
000011869 7001_ $$0P:(DE-HGF)0$$aHabel, U.$$b1
000011869 7001_ $$0P:(DE-HGF)0$$aDerntl, B.$$b2
000011869 7001_ $$0P:(DE-HGF)0$$aSchneider, F.$$b3
000011869 7001_ $$0P:(DE-Juel1)131714$$aZilles, K.$$b4$$uFZJ
000011869 7001_ $$0P:(DE-HGF)0$$aTuretsky, B.I.$$b5
000011869 7001_ $$0P:(DE-Juel1)131678$$aEickhoff, S. B.$$b6$$uFZJ
000011869 773__ $$0PERI:(DE-600)1471418-8$$a10.1016/j.neuroimage.2010.10.047$$gVol. 54, p. 2257 - 2266$$p2257 - 2266$$q54<2257 - 2266$$tNeuroImage$$v54$$x1053-8119$$y2011
000011869 8567_ $$uhttp://dx.doi.org/10.1016/j.neuroimage.2010.10.047
000011869 909CO $$ooai:juser.fz-juelich.de:11869$$pVDB
000011869 915__ $$0StatID:(DE-HGF)0010$$aJCR/ISI refereed
000011869 9141_ $$y2011
000011869 9132_ $$0G:(DE-HGF)POF3-571$$1G:(DE-HGF)POF3-570$$2G:(DE-HGF)POF3-500$$aDE-HGF$$bKey Technologies$$lDecoding the Human Brain$$vConnectivity and Activity$$x0
000011869 9131_ $$0G:(DE-HGF)POF2-89571$$1G:(DE-HGF)POF3-890$$2G:(DE-HGF)POF3-800$$3G:(DE-HGF)POF3$$4G:(DE-HGF)POF$$aDE-HGF$$bProgrammungebundene Forschung$$lohne Programm$$vConnectivity and Activity$$x1
000011869 9201_ $$0I:(DE-Juel1)INM-2-20090406$$gINM$$kINM-2$$lMolekulare Organisation des Gehirns$$x0
000011869 970__ $$aVDB:(DE-Juel1)123242
000011869 980__ $$aVDB
000011869 980__ $$aConvertedRecord
000011869 980__ $$ajournal
000011869 980__ $$aI:(DE-Juel1)INM-2-20090406
000011869 980__ $$aUNRESTRICTED