Browsing by Author "Berman, Jared M. J."
Now showing 1 - 7 of 7
Item (Open Access): Contextual influences on children's use of vocal affect cues during referential interpretation (Routledge, 2012-09)
Berman, Jared M. J.; Graham, Susan; Chambers, Craig G.

In three experiments, we investigated 5-year-olds' sensitivity to speaker vocal affect during referential interpretation in cases where the indeterminacy is or is not resolved by speech information. In Experiment 1, analyses of eye gaze patterns and pointing behaviours indicated that 5-year-olds used vocal affect cues at the point where an ambiguous description was encountered. In Experiments 2 and 3, we used unambiguous situations to investigate how the referential context influences the ability to use affect cues earlier in the utterance. Here, we found a differential use of speaker vocal affect whereby 5-year-olds' referential hypotheses were influenced by negative vocal affect cues in advance of the noun, but not by positive affect cues. Together, our findings reveal how 5-year-olds use a speaker's vocal affect to identify potential referents in different contextual situations and also suggest that children may be more attuned to negative vocal affect than positive vocal affect, particularly early in an utterance.

Item (Open Access): ELIA: A software application for integrating spoken language and eye movements (Springer, 2013-01)
Berman, Jared M. J.; Khu, Melanie; Graham, Ian; Graham, Susan

We have developed a new software application, Eye-gaze Language Integration Analysis (ELIA), which allows for the rapid integration of gaze data with spoken language input (either live or prerecorded). Specifically, ELIA integrates E-Prime output and/or .csv files that include eye-gaze and real-time language information. The process of combining eye movements with real-time speech often involves multiple error-prone steps (e.g., cleaning, transposing, graphing) before a simple time-course analysis plot can be viewed or before data can be imported into a statistical package. Some of the advantages of this freely available software include (1) reducing the amount of time spent preparing raw eye-tracking data for analysis; (2) allowing for the quick analysis of pilot data in order to identify issues with experimental design; (3) facilitating the separation of trial types, which allows for the examination of supplementary effects (e.g., order or gender effects); and (4) producing standard output files (i.e., .csv files) that can be read by numerous spreadsheet packages and transferred to any statistical software.

Item (Open Access): The object of my desire: Five-year-olds rapidly reason about a speaker's desire during referential communication (Elsevier: Journal of Experimental Child Psychology, 2017-01)
San Juan, Valerie; Chambers, Craig G.; Berman, Jared M. J.; Humphry, Chelsea; Graham, Susan

Two experiments examined whether 5-year-olds draw inferences about desire outcomes that constrain their online interpretation of an utterance. Children were informed of a speaker's positive (Experiment 1) or negative (Experiment 2) desire to receive a specific toy as a gift before hearing a referentially ambiguous statement ("That's my present") spoken with either a happy or sad voice. After hearing the speaker express a positive desire, children (N = 24) showed an implicit (i.e., eye gaze) and explicit ability to predict reference to the desired object when the speaker sounded happy, but they showed only implicit consideration of the alternate object when the speaker sounded sad. After hearing the speaker express a negative desire, children (N = 24) used only happy prosodic cues to predict the intended referent of the statement. Taken together, the findings indicate that the efficiency with which 5-year-olds integrate desire reasoning with language processing depends on the emotional valence of the speaker's voice but not on the type of desire representations (i.e., positive vs. negative) that children must reason about online.

Item (Open Access): Preschoolers use emotion in speech to learn new words (Society for Research in Child Development, 2013-02)
Callaway, Dallas; Chambers, Craig G.; Berman, Jared M. J.; Graham, Susan

Two experiments examined 4- and 5-year-olds' use of vocal affect to learn new words. In Experiment 1 (n = 48), children were presented with two unfamiliar objects, first in their original state and then in an altered state (broken or enhanced). An instruction produced with negative, neutral, or positive affect directed children to find the referent of a novel word. During the novel noun, eye gaze measures indicated that both 4- and 5-year-olds were more likely to consider an object congruent with vocal affect cues. In Experiment 2, 5-year-olds (n = 15) were asked to extend and generalize their initial mapping to new exemplars. Here, 5-year-olds generalized these newly mapped labels, but only when presented with negative vocal affect.

Item (Open Access): Preschoolers' appreciation of speaker vocal affect as a cue to referential intent (Elsevier: Journal of Experimental Child Psychology, 2010-06)
Berman, Jared M. J.; Chambers, Craig G.; Graham, Susan

An eye-tracking methodology was used to evaluate 3- and 4-year-old children's sensitivity to speaker affect when resolving referential ambiguity. Children were presented with pictures of three objects on a screen (including two referents of the same kind, e.g., an intact doll and a broken doll, and one distracter item), paired with a prerecorded referentially ambiguous instruction (e.g., "Look at the doll"). The intonation of the instruction varied in terms of the speaker's vocal affect: positive-sounding, negative-sounding, or neutral. Analyses of eye gaze patterns indicated that 4-year-olds, but not 3-year-olds, were more likely to look to the referent whose state matched the speaker's vocal affect as the noun was heard (e.g., they looked more often to the broken doll referent in the negative affect condition). These findings indicate that 4-year-olds can use speaker affect to help identify referential mappings during on-line comprehension.

Item (Open Access): Preschoolers' extension of novel words to animals and artifacts (Cambridge University Press: Journal of Child Language, 2009-10)
Graham, Susan; Welder, Andrea N.; Merrifield, Beverley A.; Berman, Jared M. J.

We examined whether preschoolers' ontological knowledge would influence lexical extension. In Experiment 1, four-year-olds were presented with a novel label for either an object with eyes described as an animal, or the same object without eyes described as a tool. In the animal condition, children extended the label to similar-shaped objects, whereas in the tool condition, children extended the label to similar-function objects. In Experiment 2, when four-year-olds were presented with objects with eyes described as tools, they extended the label on the basis of shared function. These experiments suggest that preschoolers' conceptual knowledge guides their lexical extension.

Item (Open Access): Preschoolers' real-time coordination of vocal and facial emotional information (Journal of Experimental Child Psychology, 2016-10)
Berman, Jared M. J.; Chambers, Craig G.; Graham, Susan

An eye-tracking methodology was used to examine the time course of 3- and 5-year-olds' ability to link speech bearing different acoustic cues to emotion (i.e., happy-sounding, neutral, and sad-sounding intonation) to photographs of faces reflecting different emotional expressions. Analyses of saccadic eye movement patterns indicated that, for both 3- and 5-year-olds, sad-sounding speech triggered gaze shifts to a matching (sad-looking) face from the earliest moments of speech processing. However, it was not until approximately 800 ms into a happy-sounding utterance that preschoolers began to use the emotional cues from speech to identify a matching (happy-looking) face. Complementary analyses based on conscious/controlled behaviors (children's explicit points toward the faces) indicated that 5-year-olds, but not 3-year-olds, could successfully match happy-sounding and sad-sounding vocal affect to a corresponding emotional face. Together, the findings clarify developmental patterns in preschoolers' implicit versus explicit ability to coordinate emotional cues across modalities and highlight preschoolers' greater sensitivity to sad-sounding speech as the auditory signal unfolds in time.
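The ELIA entry above describes automating a time-course analysis: merging timestamped gaze samples with the timing of spoken words before plotting or statistical export. As a rough illustration of that kind of computation only (this is not ELIA's actual API; the function name, tuple layout, and "target" label are hypothetical), a minimal sketch in Python:

```python
# Hypothetical sketch (not ELIA's API): align eye-gaze samples with a
# spoken-word onset and compute a simple time course of target looks.
from typing import List, Tuple


def proportion_target_looks(
    samples: List[Tuple[int, str]],  # (timestamp_ms, area_of_interest)
    noun_onset_ms: int,
    bin_ms: int = 200,
    n_bins: int = 4,
) -> List[float]:
    """For each time bin after noun onset, return the proportion of gaze
    samples that fell on the 'target' area of interest."""
    props = []
    for b in range(n_bins):
        lo = noun_onset_ms + b * bin_ms
        hi = lo + bin_ms
        in_bin = [aoi for t, aoi in samples if lo <= t < hi]
        props.append(
            sum(a == "target" for a in in_bin) / len(in_bin) if in_bin else 0.0
        )
    return props


# Example: gaze sampled every 50 ms; the noun begins at 1000 ms, and the
# child starts fixating the target at 1200 ms.
samples = [
    (t, "target" if t >= 1200 else "distractor")
    for t in range(900, 1800, 50)
]
print(proportion_target_looks(samples, noun_onset_ms=1000))
# → [0.0, 1.0, 1.0, 1.0]
```

In practice the per-bin proportions would come from .csv gaze logs and word-onset times rather than in-memory lists; the binning logic is the part such tools automate.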