001037610 001__ 1037610
001037610 005__ 20250203124505.0
001037610 0247_ $$2doi$$a10.3389/fcomm.2024.1483135
001037610 0247_ $$2datacite_doi$$a10.34734/FZJ-2025-00781
001037610 0247_ $$2WOS$$aWOS:001360545700001
001037610 037__ $$aFZJ-2025-00781
001037610 082__ $$a380
001037610 1001_ $$0P:(DE-HGF)0$$aZimmermann, Juliane T.$$b0
001037610 245__ $$aLookers and listeners on the autism spectrum: the roles of gaze duration and pitch height in inferring mental states
001037610 260__ $$aLausanne$$bFrontiers Media$$c2024
001037610 3367_ $$2DRIVER$$aarticle
001037610 3367_ $$2DataCite$$aOutput Types/Journal article
001037610 3367_ $$0PUB:(DE-HGF)16$$2PUB:(DE-HGF)$$aJournal Article$$bjournal$$mjournal$$s1737445642_5302
001037610 3367_ $$2BibTeX$$aARTICLE
001037610 3367_ $$2ORCID$$aJOURNAL_ARTICLE
001037610 3367_ $$00$$2EndNote$$aJournal Article
001037610 520__ $$aAlthough mentalizing abilities in autistic adults without intelligence deficits are similar to those of control participants in tasks relying on verbal information, they are dissimilar in tasks relying on non-verbal information. The current study investigates mentalizing behavior in autism in a paradigm involving two important non-verbal means of communicating mental states: eye gaze and speech intonation. In an eye-tracking experiment, participants with ASD and a control group watched videos showing a virtual character gazing at objects while an utterance was presented auditorily. We varied the virtual character’s gaze duration toward the object (600 or 1800 ms) and the height of the pitch peak on the accented syllable of the word denoting the object. Pitch height on the accented syllable was varied by 45 Hz, resulting in high or low prosodic emphasis. Participants were asked to rate the importance of the given object for the virtual character. At the end of the experiment, we assessed in a recognition task how well participants remembered the objects they had been presented with. Overall, both longer gaze duration and higher pitch increased the importance ratings of the object for the virtual character. Compared to the control group, ratings of the autistic group were lower for short gaze durations but higher when gaze was long and pitch was low. Regardless of an ASD diagnosis, participants clustered into three behaviorally distinct subgroups, representing individuals whose ratings were influenced (1) predominantly by gaze duration, (2) predominantly by pitch height, or (3) by neither, accordingly labeled “Lookers,” “Listeners,” and “Neithers” in our study. “Lookers” spent more time fixating the virtual character’s eye region than “Listeners,” while both “Listeners” and “Neithers” spent more time fixating the object than “Lookers.” Object recognition was independent of the virtual character’s gaze duration toward the object, of pitch height, and of an ASD diagnosis. Our results show that autistic persons effectively use gaze duration and intonation to infer the importance of an object for a virtual character. Notably, compared to the control group, autistic participants were influenced more strongly by gaze duration than by pitch height.
001037610 536__ $$0G:(DE-HGF)POF4-5251$$a5251 - Multilevel Brain Organization and Variability (POF4-525)$$cPOF4-525$$fPOF IV$$x0
001037610 536__ $$0G:(GEPRIS)281511265$$aDFG project 281511265 - SFB 1252: Prominence in Language (281511265)$$c281511265$$x1
001037610 588__ $$aDataset connected to CrossRef, Journals: juser.fz-juelich.de
001037610 7001_ $$0P:(DE-HGF)0$$aEllison, T. Mark$$b1
001037610 7001_ $$0P:(DE-HGF)0$$aCangemi, Francesco$$b2
001037610 7001_ $$0P:(DE-HGF)0$$aWehrle, Simon$$b3
001037610 7001_ $$0P:(DE-Juel1)176404$$aVogeley, Kai$$b4$$ufzj
001037610 7001_ $$0P:(DE-HGF)0$$aGrice, Martine$$b5
001037610 773__ $$0PERI:(DE-600)2856337-2$$a10.3389/fcomm.2024.1483135$$gVol. 9, p. 1483135$$p1483135$$tFrontiers in communication$$v9$$x2297-900X$$y2024
001037610 8564_ $$uhttps://juser.fz-juelich.de/record/1037610/files/PDF.pdf$$yOpenAccess
001037610 909CO $$ooai:juser.fz-juelich.de:1037610$$pdnbdelivery$$pdriver$$pVDB$$popen_access$$popenaire
001037610 9101_ $$0I:(DE-588b)5008462-8$$6P:(DE-Juel1)176404$$aForschungszentrum Jülich$$b4$$kFZJ
001037610 9131_ $$0G:(DE-HGF)POF4-525$$1G:(DE-HGF)POF4-520$$2G:(DE-HGF)POF4-500$$3G:(DE-HGF)POF4$$4G:(DE-HGF)POF$$9G:(DE-HGF)POF4-5251$$aDE-HGF$$bKey Technologies$$lNatural, Artificial and Cognitive Information Processing$$vDecoding Brain Organization and Dysfunction$$x0
001037610 9141_ $$y2024
001037610 915__ $$0StatID:(DE-HGF)0200$$2StatID$$aDBCoverage$$bSCOPUS$$d2024-12-21
001037610 915__ $$0LIC:(DE-HGF)CCBY4$$2HGFVOC$$aCreative Commons Attribution CC BY 4.0
001037610 915__ $$0StatID:(DE-HGF)0112$$2StatID$$aWoS$$bEmerging Sources Citation Index$$d2024-12-21
001037610 915__ $$0StatID:(DE-HGF)0501$$2StatID$$aDBCoverage$$bDOAJ Seal$$d2023-12-04T13:10:06Z
001037610 915__ $$0StatID:(DE-HGF)0500$$2StatID$$aDBCoverage$$bDOAJ$$d2023-12-04T13:10:06Z
001037610 915__ $$0StatID:(DE-HGF)0700$$2StatID$$aFees$$d2024-12-21
001037610 915__ $$0StatID:(DE-HGF)0150$$2StatID$$aDBCoverage$$bWeb of Science Core Collection$$d2024-12-21
001037610 915__ $$0StatID:(DE-HGF)0510$$2StatID$$aOpenAccess
001037610 915__ $$0StatID:(DE-HGF)0030$$2StatID$$aPeer Review$$bDOAJ : Anonymous peer review$$d2023-12-04T13:10:06Z
001037610 915__ $$0StatID:(DE-HGF)0561$$2StatID$$aArticle Processing Charges$$d2024-12-21
001037610 915__ $$0StatID:(DE-HGF)0300$$2StatID$$aDBCoverage$$bMedline$$d2024-12-21
001037610 915__ $$0StatID:(DE-HGF)0199$$2StatID$$aDBCoverage$$bClarivate Analytics Master Journal List$$d2024-12-21
001037610 920__ $$lyes
001037610 9201_ $$0I:(DE-Juel1)INM-3-20090406$$kINM-3$$lCognitive Neuroscience$$x0
001037610 980__ $$ajournal
001037610 980__ $$aVDB
001037610 980__ $$aUNRESTRICTED
001037610 980__ $$aI:(DE-Juel1)INM-3-20090406
001037610 9801_ $$aFullTexts