Brain plasticity: Evidence from children with prenatal brain injury. (Reilly, J., Levine, S., Nass, R., & Stiles, J., 2008)
Co-speech gestures influence neural responses in brain regions associated with semantic processing. (Dick, A. S., Goldin-Meadow, S., Hasson, U., Skipper, J. I., & Small, S. L., 2009)
Do parents lead their children by the hand? (Ozcaliskan, S., & Goldin-Meadow, S., 2005)
Does language about similarity play a role in fostering similarity comparison in children? (Ozcaliskan, S., Goldin-Meadow, S., Gentner, D., & Mylander, C., 2009)
Does linguistic input play the same role in language learning for children with and without early brain injury? (Rowe, M.L., Levine, S. C., Fisher, J., & Goldin-Meadow, S., 2009)
Differences in early gesture explain SES disparities in child vocabulary size at school entry. (Rowe, M.L., & Goldin-Meadow, S., 2009)
Early gesture predicts language delay in children with pre- or perinatal brain lesions. (Sauer, E., Levine, S.C., Rowe, M. & Goldin-Meadow, S., 2010)
Early gesture selectively predicts later language learning. (Rowe, M.L., & Goldin-Meadow, S., 2009)
Emergence of syntax: Commonalities and differences across children. (Vasilyeva, M., Waterfall, H. & Huttenlocher, J., 2008)
Exploring test-retest reliability in fMRI of language: Group and task effects. (Chen, E. E., & Small, S. L., 2007)
Gesture is at the cutting edge of early language development. (Ozcaliskan, S., & Goldin-Meadow, S., 2005)
Gestures orchestrate brain networks for language understanding. (Skipper, J. I., Goldin-Meadow, S., Nusbaum, H. C., & Small, S. L., 2009)
Sources of variability in children’s language growth. (Huttenlocher, J., Waterfall, H., Vasilyeva, M., Vevea, J., & Hedges, L., in press)
Is there an iconic gesture spurt at 26 months? (Ozcaliskan, S., & Goldin-Meadow, S., in press)
Language input and child syntax. (Huttenlocher, J., Vasilyeva, M., Cymerman, E., & Levine, S., 2002)
Learning words by hand: Gesture’s role in predicting vocabulary development. (Rowe, M. L., Ozcaliskan, S., & Goldin-Meadow, S., 2008)
Narrative skill in children with early unilateral brain injury: a limit to functional plasticity. (Demir, E., Levine, S.C., & Goldin-Meadow, S., 2010)
Neural development of networks for audiovisual speech comprehension. (Dick, A. S., Solodkin, A., & Small, S. L., 2010)
Parental goals and talk to children. (Rowe, M. L., & Casillas, A., in press)
Sex differences in language first appear in gesture. (Ozcaliskan, S., & Goldin-Meadow, S., 2010)
When gesture-speech combinations do and do not index linguistic change. (Ozcaliskan, S., & Goldin-Meadow, S., 2009)
Young children use their hands to tell their mothers what to say. (Goldin-Meadow, S., Goodrich, W., Sauer, E., & Iverson, J., 2007)
Reilly, J., Levine, S., Nass, R., & Stiles, J., 2008, “Brain plasticity: Evidence from children with prenatal brain injury”, Child Neuropsychology, Eds. J. Reed & J. Warner. Oxford: Blackwell Publishing, 2008.
Abstract and PDF are currently unavailable for this book chapter.
Dick, A. S., Goldin-Meadow, S., Hasson, U., Skipper, J. I., & Small, S. L., 2009, “Co-speech gestures influence neural responses in brain regions associated with semantic processing”, Human Brain Mapping, 30.11 (2009), 3509-3526. (PMID: 19384890). PDF
Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker’s message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech.
Ozcaliskan, S., & Goldin-Meadow, S., 2005, “Do parents lead their children by the hand?”, Journal of Child Language, 32.3 (2005), 481-505. PDF
The types of gesture+speech combinations children produce during the early stages of language development change over time. This change, in turn, predicts the onset of two-word speech and thus might reflect a cognitive transition that the child is undergoing. An alternative, however, is that the change merely reflects changes in the types of gesture+speech combinations that their caregivers produce. To explore this possibility, we videotaped 40 American child–caregiver dyads in their homes for 90 minutes when the children were 1;2, 1;6, and 1;10. Each gesture was classified according to type (deictic, conventional, representational) and the relation it held to speech (reinforcing, disambiguating, supplementary). Children and their caregivers produced the same types of gestures and in approximately the same distribution. However, the children differed from their caregivers in the way they used gesture in relation to speech. Over time, children produced many more reinforcing (bike+point at bike), disambiguating (that one+point at bike), and supplementary combinations (ride+point at bike). In contrast, the frequency and distribution of caregivers’ gesture+speech combinations remained constant over time. Thus, the changing relation between gesture and speech observed in the children cannot be traced back to the gestural input the children receive. Rather, it appears to reflect changes in the children’s own skills, illustrating once again gesture’s ability to shed light on developing cognitive and linguistic processes.
Ozcaliskan, S., Goldin-Meadow, S., Gentner, D., & Mylander, C., 2009, “Does language about similarity play a role in fostering similarity comparison in children?”, Cognition, 112.2 (2009), 217-228. (PMID: 19524220). PDF
Commenting on perceptual similarities between objects stands out as an important linguistic achievement, one that may pave the way towards noticing and commenting on more abstract relational commonalities between objects. To explore whether having a conventional linguistic system is necessary for children to comment on different types of similarity comparisons, we observed four children who had not been exposed to usable linguistic input – deaf children whose hearing losses prevented them from learning spoken language and whose hearing parents had not exposed them to sign language. These children developed gesture systems that have language-like structure at many different levels. Here we ask whether the deaf children used their gestures to comment on similarity relations and, if so, which types of relations they expressed. We found that all four deaf children were able to use their gestures to express similarity comparisons (point to cat + point to tiger) resembling those conveyed by 40 hearing children in early gesture + speech combinations (cat + point to tiger). However, the two groups diverged at later ages. Hearing children, after acquiring the word like, shifted from primarily expressing global similarity (as in cat/tiger) to primarily expressing single-property similarity (as in crayon is brown like my hair). In contrast, the deaf children, lacking an explicit term for similarity, continued to primarily express global similarity. The findings underscore the robustness of similarity comparisons in human communication, but also highlight the importance of conventional terms for comparison as likely contributors to routinely expressing more focused similarity relations.
Rowe, M.L., Levine, S. C., Fisher, J., & Goldin-Meadow, S., 2009, “Does linguistic input play the same role in language learning for children with and without early brain injury?”, Developmental Psychology, 45.1 (2009), 90-100. (NIHMS 59552). PDF
Children with unilateral pre- or perinatal brain injury (BI) show remarkable plasticity for language learning. Previous work highlights the important role that lesion characteristics play in explaining individual variation in plasticity in the language development of children with BI. The current study examines whether the linguistic input that children with BI receive from their caregivers also contributes to this early plasticity, and whether linguistic input plays a similar role in children with BI as it does in typically developing (TD) children. Growth in vocabulary and syntactic production is modeled for 80 children (53 TD, 27 BI) between 14 and 46 months. Findings indicate that caregiver input is an equally potent predictor of syntactic growth for children with BI as for TD children. Controlling for input, lesion characteristics (lesion size, type, seizure history) also affect the language trajectories of children with BI. Thus, findings illustrate how both variability in the environment (linguistic input) and variability in the organism (lesion characteristics) work together to contribute to plasticity in language learning.
Rowe, M.L., & Goldin-Meadow, S., 2009, “Differences in early gesture explain SES disparities in child vocabulary size at school entry”, Science, 323 (2009), 951-953. (NIHMS 89518). PDF
Children from low-socioeconomic status (SES) families, on average, arrive at school with smaller vocabularies than children from high-SES families. In an effort to identify precursors to, and possible remedies for, this inequality, we videotaped 50 children from families with a range of different SES interacting with parents at 14 months and assessed their vocabulary skills at 54 months. We found that children from high-SES families frequently used gesture to communicate at 14 months, a relation that was explained by parent gesture use (with speech controlled). In turn, the fact that children from high-SES families have large vocabularies at 54 months was explained by children’s gesture use at 14 months. Thus, differences in early gesture help to explain the disparities in vocabulary that children bring with them to school.
Sauer, E., Levine, S.C., Rowe, M., & Goldin-Meadow, S., 2010, “Early gesture predicts language delay in children with pre- or perinatal brain lesions”, Child Development, 81 (2010), 528-539. (NIHMS 174716). PDF
Does early gesture use predict later productive and receptive vocabulary in children with pre- or perinatal unilateral brain lesions (PL)? Eleven children with PL were categorized into two groups based on whether their gesture at 18 months was within or below the range of typically developing (TD) children. Children with PL whose gesture was within the TD range developed a productive vocabulary at 22 and 26 months and a receptive vocabulary at 30 months that were all within the TD range. In contrast, children with PL whose gesture was below the TD range did not. Gesture was thus an early marker of which children with early unilateral lesions would eventually experience language delay, suggesting that gesture is a promising diagnostic tool for persistent delay.
Rowe, M.L., & Goldin-Meadow, S., 2009, “Early gesture selectively predicts later language learning”, Developmental Science, 12.1 (2009), 182-187. (PMID 19120426). PDF
The gestures children produce predict the early stages of spoken language development. Here we ask whether gesture is a global predictor of language learning, or whether particular gestures predict particular language outcomes. We observed 52 children interacting with their caregivers at home, and found that gesture use at 18 months selectively predicted lexical versus syntactic skills at 42 months, even with early child speech controlled. Specifically, number of different meanings conveyed in gesture at 18 months predicted vocabulary at 42 months, but number of gesture+speech combinations did not. In contrast, number of gesture+speech combinations, particularly those conveying sentence-like ideas, produced at 18 months predicted sentence complexity at 42 months, but meanings conveyed in gesture did not. We can thus predict particular milestones in vocabulary and sentence complexity at age 3 1/2 by watching how children move their hands two years earlier.
Vasilyeva, M., Waterfall, H. & Huttenlocher, J., 2008, “Emergence of syntax: Commonalities and differences across children”, Developmental Science, 11.1 (2008), 84-97. PDF
This paper presents the results of a longitudinal examination of syntactic skills, starting at the age of emergence of simple sentences and continuing through the emergence of complex sentences. We ask whether there is systematic variability among children from different socioeconomic backgrounds in the early stages of sentence production. The results suggest a different answer for simple versus complex sentences. We found a striking similarity across SES groups on the measures tapping early mastery of basic syntactic rules of simple sentences. At the same time, there was a significant difference between SES groups in the mastery of complex sentence structures. This difference emerged at the earliest stages of production of multi-clause sentences and persisted throughout the period of observation. The implications of these findings for the understanding of mechanisms of syntactic development are discussed.
Chen, E. E., & Small, S. L., 2007, “Exploring test-retest reliability in fMRI of language: Group and task effects”, Brain and Language, 102 (2007), 176-185. (NIHMS 91756).
Abstract and PDF are currently unavailable for this article.
Ozcaliskan, S., & Goldin-Meadow, S., 2005, “Gesture is at the cutting edge of early language development”, Cognition, 96.3 (2005), B101-113. PDF
Children who produce one word at a time often use gesture to supplement their speech, turning a single word into an utterance that conveys a sentence-like meaning (‘eat’ + point at cookie). Interestingly, the age at which children first produce supplementary gesture-speech combinations of this sort reliably predicts the age at which they first produce two-word utterances. Gesture thus serves as a signal that a child will soon be ready to begin producing multi-word sentences. The question is what happens next. Gesture could continue to expand a child’s communicative repertoire over development, combining with words to convey increasingly complex ideas. Alternatively, after serving as an opening wedge into language, gesture could cease its role as a forerunner of linguistic change. We addressed this question in a sample of 40 typically developing children, each observed at 14, 18, and 22 months. The number of supplementary gesture-speech combinations the children produced increased significantly from 14 to 22 months. More importantly, the types of supplementary combinations the children produced changed over time and presaged changes in their speech. Children produced three distinct constructions across the two modalities several months before these same constructions appeared entirely within speech. Gesture thus continues to be at the cutting edge of early language development, providing stepping-stones to increasingly complex linguistic constructions.
Skipper, J. I., Goldin-Meadow, S., Nusbaum, H. C., & Small, S. L., 2009, “Gestures orchestrate brain networks for language understanding”, Current Biology, 19.8 (2009), 661-667. (PMID 19327997). PDF
Although the linguistic structure of speech provides valuable communicative information, nonverbal behaviors can offer additional, often disambiguating cues. In particular, being able to see the face and hand movements of a speaker facilitates language comprehension. But how does the brain derive meaningful information from these movements? Mouth movements provide information about phonological aspects of speech. In contrast, cospeech gestures display semantic information relevant to the intended message. We show that when language comprehension is accompanied by observable face movements, there is strong functional connectivity between areas of cortex involved in motor planning and production and posterior areas thought to mediate phonological aspects of speech perception. In contrast, language comprehension accompanied by cospeech gestures is associated with tuning of and strong functional connectivity between motor planning and production areas and anterior areas thought to mediate semantic aspects of language comprehension. These areas are not tuned to hand and arm movements that are not meaningful. Results suggest that when gestures accompany speech, the motor system works with language comprehension areas to determine the meaning of those gestures. Results also suggest that the cortical networks underlying language comprehension, rather than being fixed, are dynamically organized by the type of contextual information available to listeners during face-to-face communication.
Huttenlocher, J., Waterfall, H., Vasilyeva, M., Vevea, J., & Hedges, L., in press, “Sources of variability in children’s language growth”, Cognitive Psychology, in press. (NIHMS 174787).
The present longitudinal study examines the role of caregiver speech in language development, especially syntactic development, using 47 parent–child pairs of diverse SES backgrounds, followed from 14 to 46 months. We assess the diversity (variety) of words and syntactic structures produced by caregivers and children. We use lagged correlations to examine language growth and its relation to caregiver speech. Results show substantial individual differences among children, and indicate that diversity of earlier caregiver speech significantly predicts corresponding diversity in later child speech. For vocabulary, earlier child speech also predicts later caregiver speech, suggesting mutual influence. However, for syntax, earlier child speech does not significantly predict later caregiver speech, suggesting a causal flow from caregiver to child. Finally, demographic factors, notably SES, are related to language growth, and this relation is, at least partially, mediated by differences in caregiver speech, showing the pervasive influence of caregiver speech on language growth.
Ozcaliskan, S., & Goldin-Meadow, S., in press, “Is there an iconic gesture spurt at 26 months?”, Integrating gestures: The interdisciplinary nature of gesture, Eds. G. Stam & M. Ishino. Amsterdam, NL: John Benjamins, in press.
Abstract and PDF are currently unavailable for this book chapter.
Huttenlocher, J., Vasilyeva, M., Cymerman, E., & Levine, S., 2002, “Language input and child syntax”, Cognitive Psychology, 45.3 (2002), 337-374. PDF
Existing work on the acquisition of syntax has been concerned mainly with the early stages of syntactic development. In the present study we examine later syntactic development in children. Also, existing work has focused on commonalities in the emergence of syntax. Here we explore individual differences among children and their relation to variations in language input. In Study 1 we find substantial individual differences in children’s mastery of multiclause sentences and a significant relation between those differences and the proportion of multiclause sentences in parent speech. We also find individual differences in the number of noun phrases in children’s utterances and a significant relation between those differences and the number of noun phrases in parent speech. In Study 2 we find greater syntactic growth over a year of preschool in classes where teachers’ speech is more syntactically complex. The implications of our findings for the understanding of the sources of syntactic development are discussed.
Rowe, M. L., Ozcaliskan, S., & Goldin-Meadow, S., 2008, “Learning words by hand: Gesture’s role in predicting vocabulary development”, First Language, 28.2 (2008), 182-199. PDF
Children vary widely in how quickly their vocabularies grow. Can looking at early gesture use in children and parents help us predict this variability? We videotaped 53 English-speaking parent-child dyads in their homes during their daily activities for 90 minutes every four months between child age 14 and 34 months. At 42 months, children were given the Peabody Picture Vocabulary Test (PPVT). We found that child gesture use at 14 months was a significant predictor of vocabulary size at 42 months, above and beyond the effects of parent and child word use at 14 months. Parent gesture use at 14 months was not directly related to vocabulary development, but did relate to child gesture use at 14 months which, in turn, predicted child vocabulary. These relations hold even when background factors such as socio-economic status are controlled. The findings underscore the importance of examining early gesture when predicting child vocabulary development.
Demir, E., Levine, S.C., & Goldin-Meadow, S., 2010, “Narrative skill in children with early unilateral brain injury: a limit to functional plasticity”, Developmental Science, 13.4 (2010), 636-647. PDF
Children with pre- or perinatal brain injury (PL) exhibit marked plasticity for language learning. Previous work has focused mostly on the emergence of earlier-developing skills, such as vocabulary and syntax. Here we ask whether this plasticity for earlier-developing aspects of language extends to more complex, later-developing language functions by examining the narrative production of children with PL. Using an elicitation technique that involves asking children to create stories de novo in response to a story stem, we collected narratives from 11 children with PL and 20 typically developing (TD) children. Narratives were analysed for length, diversity of the vocabulary used, use of complex syntax, complexity of the macro-level narrative structure and use of narrative evaluation. Children’s language performance on vocabulary and syntax tasks outside the narrative context was also measured. Findings show that children with PL produced shorter stories, used less diverse vocabulary, produced structurally less complex stories at the macro-level, and made fewer inferences regarding the cognitive states of the story characters. These differences in the narrative task emerged even though children with PL did not differ from TD children on vocabulary and syntax tasks outside the narrative context. Thus, findings suggest that there may be limitations to the plasticity for language functions displayed by children with PL, and that these limitations may be most apparent in complex, decontextualized language tasks such as narrative production.
Dick, A. S., Solodkin, A., & Small, S. L., 2010, “Neural development of networks for audiovisual speech comprehension”, Brain and Language, 114.2 (2010), 101-114. (PMID: 19781755). PDF
Everyday conversation is both an auditory and a visual phenomenon. While visual speech information enhances comprehension for the listener, evidence suggests that the ability to benefit from this information improves with development. A number of brain regions have been implicated in audiovisual speech comprehension, but the extent to which the neurobiological substrate in the child compares to the adult is unknown. In particular, developmental differences in the network for audiovisual speech comprehension could manifest through the incorporation of additional brain regions, or through different patterns of effective connectivity. In the present study we used functional magnetic resonance imaging and structural equation modeling (SEM) to characterize the developmental changes in network interactions for audiovisual speech comprehension. The brain response was recorded while 8- to 11-year-old children and adults passively listened to stories under audiovisual (AV) and auditory-only (A) conditions. Results showed that in children and adults, AV comprehension activated the same fronto-temporo-parietal network of regions known for their contribution to speech production and perception. However, the SEM network analysis revealed age-related differences in the functional interactions among these regions. In particular, the influence of the posterior inferior frontal gyrus/ventral premotor cortex on supramarginal gyrus differed across age groups during AV, but not A, speech. This functional pathway might be important for relating motor and sensory information used by the listener to identify speech sounds. Further, its development might reflect changes in the mechanisms that relate visual speech information to articulatory speech representations through experience producing and perceiving speech.
Rowe, M. L., & Casillas, A., in press, “Parental goals and talk to children”, Infant and Child Development, in press. PDF
Myriad studies support a relation between parental beliefs and behaviors. This study adds to the literature by focusing on the specific relationship between parents’ goals and their communication with toddlers. Do parents with different goals talk about different topics with their children? Parents’ goals for their 30-month-olds were gathered using semi-structured interviews with 47 primary caregivers, whereas the topics of conversation that arose during interactions were investigated by coding videotaped observations in the home. Parents’ short- and long-term goals spanned several areas, including educational, social-emotional, developmental and pragmatic goals. Parental utterances most frequently focused on pragmatic issues, followed by play and academic topics. Parents who mentioned long-term educational goals devoted more of their talk to academic topics and less to pragmatic topics, controlling for socio-economic status. Thus, parental goals differ and these differences relate to the conversations parents engage in with their children.
Ozcaliskan, S., & Goldin-Meadow, S., 2010, “Sex differences in language first appear in gesture”, Developmental Science, 13.5 (2010), 752-760. (NIHMS 174739). PDF
Children differ in how quickly they reach linguistic milestones. Boys typically produce their first multi-word sentences later than girls do. We ask here whether there are sex differences in children’s gestures that precede, and presage, these sex differences in speech. To explore this question, we observed 22 girls and 18 boys every 4 months as they progressed from one-word speech to multi-word speech. We found that boys not only produced speech + speech (S+S) combinations (‘drink juice’) 3 months later than girls, but they also produced gesture + speech (G+S) combinations expressing the same types of semantic relations (‘eat’ + point at cookie) 3 months later than girls. Because G+S combinations are produced earlier than S+S combinations, children’s gestures provide the first sign that boys are likely to lag behind girls in the onset of sentence constructions.
Ozcaliskan, S., & Goldin-Meadow, S., 2009, “When gesture-speech combinations do and do not index linguistic change”, Language and Cognitive Processes, 28 (2009), 190-217. (NIHMS 115848). PDF
At the one-word stage children use gesture to supplement their speech (‘eat’ + point at cookie), and the onset of such supplementary gesture-speech combinations predicts the onset of two-word speech (‘eat cookie’). Gesture thus signals a child’s readiness to produce two-word constructions. The question we ask here is what happens when the child begins to flesh out these early skeletal two-word constructions with additional arguments. One possibility is that gesture continues to be a forerunner of linguistic change as children flesh out their skeletal constructions by adding arguments. Alternatively, after serving as an opening wedge into language, gesture could cease its role as a forerunner of linguistic change. Our analysis of 40 children, observed from 14 to 34 months, showed that children relied on gesture to produce the first instance of a variety of constructions. However, once each construction was established in their repertoire, the children did not use gesture to flesh out the construction. Gesture thus acts as a harbinger of linguistic steps only when those steps involve new constructions, not when the steps merely flesh out existing constructions.
Goldin-Meadow, S., Goodrich, W., Sauer, E., & Iverson, J., 2007, “Young children use their hands to tell their mothers what to say”, Developmental Science, 10.6 (2007), 778-785. PDF
Children produce their first gestures before their first words, and their first gesture+word sentences before their first word+word sentences. These gestural accomplishments have been found not only to predate linguistic milestones, but also to predict them. Findings of this sort suggest that gesture itself might be playing a role in the language-learning process. But what role does it play? Children’s gestures could elicit from their mothers the kinds of words and sentences that the children need to hear in order to take their next linguistic step. We examined maternal responses to the gestures and speech that 10 children produced during the one-word period. We found that all 10 mothers ‘translated’ their children’s gestures into words, providing timely models for how one- and two-word ideas can be expressed in English. Gesture thus offers a mechanism by which children can point out their thoughts to mothers, who then calibrate their speech to those thoughts, and potentially facilitate language learning.