Preschoolers' real-time coordination of vocal and facial emotional information

dc.contributor.author: Berman, Jared M. J.
dc.contributor.author: Chambers, Craig G.
dc.contributor.author: Graham, Susan
dc.date.accessioned: 2020-04-20T23:01:24Z
dc.date.available: 2020-04-20T23:01:24Z
dc.date.issued: 2016-10
dc.description.abstract: An eye-tracking methodology was used to examine the time course of 3- and 5-year-olds' ability to link speech bearing different acoustic cues to emotion (i.e., happy-sounding, neutral, and sad-sounding intonation) to photographs of faces reflecting different emotional expressions. Analyses of saccadic eye movement patterns indicated that, for both 3- and 5-year-olds, sad-sounding speech triggered gaze shifts to a matching (sad-looking) face from the earliest moments of speech processing. However, it was not until approximately 800 ms into a happy-sounding utterance that preschoolers began to use the emotional cues from speech to identify a matching (happy-looking) face. Complementary analyses based on conscious/controlled behaviors (children's explicit points toward the faces) indicated that 5-year-olds, but not 3-year-olds, could successfully match happy-sounding and sad-sounding vocal affect to a corresponding emotional face. Together, the findings clarify developmental patterns in preschoolers' implicit versus explicit ability to coordinate emotional cues across modalities and highlight preschoolers' greater sensitivity to sad-sounding speech as the auditory signal unfolds in time.
dc.description.grantingagency: Social Sciences and Humanities Research Council (SSHRC)
dc.description.grantingagency: Alberta Innovates - Research Grant
dc.identifier.citation: Berman, J. M. J., Chambers, C. G., & Graham, S. A. (2016). Preschoolers' real-time coordination of vocal and facial emotional information. Journal of Experimental Child Psychology, 142, 391-399. http://dx.doi.org/10.1016/j.jecp.2015.09.014
dc.identifier.doi: http://dx.doi.org/10.1016/j.jecp.2015.09.014
dc.identifier.uri: http://hdl.handle.net/1880/111820
dc.identifier.uri: https://doi.org/10.11575/PRISM/43623
dc.language.iso: eng
dc.publisher: Journal of Experimental Child Psychology
dc.publisher.department: Psychology
dc.publisher.faculty: Arts
dc.publisher.hasversion: publishedVersion
dc.publisher.institution: University of Calgary
dc.publisher.institution: University of Toronto
dc.rights: Unless otherwise indicated, this material is protected by copyright and has been made available with authorization from the copyright owner. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
dc.rights.uri: https://creativecommons.org/licenses/by/4.0
dc.title: Preschoolers' real-time coordination of vocal and facial emotional information
dc.type: journal article
ucalgary.item.requestcopy: true
Files

Original bundle
Name: Berman, J.M.J., Chambers, C. G., & Graham, S.A., (2016) Journal of Experimental Child Psychology.pdf
Size: 429.59 KB
Format: Adobe Portable Document Format
Description: Main article
License bundle
Name: license.txt
Size: 1.92 KB
Description: Item-specific license agreed upon to submission