Publications


Featured research published by Şeyda Özçalışkan.


Autism | 2016

Early deictic but not other gestures predict later vocabulary in both typical development and autism

Şeyda Özçalışkan; Lauren B. Adamson; Nevena Dimitrova

Research with typically developing children suggests a strong positive relation between early gesture use and subsequent vocabulary development. In this study, we ask whether gesture production plays a similar role for children with autism spectrum disorder. We observed 23 18-month-old typically developing children and 23 30-month-old children with autism spectrum disorder interacting with their caregivers (Communication Play Protocol) and coded the types of gestures children produced (deictic, give, conventional, and iconic) in two communicative contexts (commenting and requesting). One year later, we assessed children’s expressive vocabulary using the Expressive Vocabulary Test. Children with autism spectrum disorder showed significant deficits in gesture production, particularly in deictic gestures (i.e., gestures that indicate objects by pointing at them or by holding them up). Importantly, deictic gestures—but not other gestures—predicted children’s vocabulary 1 year later regardless of communicative context, a pattern also found in typical development. We conclude that the production of deictic gestures serves as a stepping-stone for vocabulary development.


Journal of Autism and Developmental Disorders | 2016

Parents’ Translations of Child Gesture Facilitate Word Learning in Children with Autism, Down Syndrome and Typical Development

Nevena Dimitrova; Şeyda Özçalışkan; Lauren B. Adamson

Typically developing (TD) children frequently refer to objects uniquely in gesture. Parents translate these gestures into words, facilitating children’s acquisition of these words (Goldin-Meadow et al. in Dev Sci 10(6):778–785, 2007). We ask whether this pattern holds for children with autism (AU) and with Down syndrome (DS), who show delayed vocabulary development. We observed 23 children with AU, 23 with DS, and 23 TD children with their parents over a year. Children used gestures to indicate objects before labeling them, and parents translated their gestures into words. Importantly, children benefited from this input, acquiring more words for the translated gestures than for the untranslated ones. Results highlight the role that contingent parental input to child gesture plays in the language development of children with developmental disorders.


Cognition | 2016

Does language shape silent gesture?

Şeyda Özçalışkan; Ché Lucero; Susan Goldin-Meadow

Languages differ in how they organize events, particularly in the types of semantic elements they express and the arrangement of those elements within a sentence. Here we ask whether these cross-linguistic differences have an impact on how events are represented nonverbally; more specifically, on how events are represented in gestures produced without speech (silent gesture), compared to gestures produced with speech (co-speech gesture). We observed speech and gesture in 40 adult native speakers of English and Turkish (n = 20 per language) asked to describe physical motion events (e.g., running down a path), a domain known to elicit distinct patterns of speech and co-speech gesture in English and Turkish speakers. Replicating previous work (Kita & Özyürek, 2003), we found an effect of language on gesture when it was produced with speech: co-speech gestures produced by English speakers differed from co-speech gestures produced by Turkish speakers. However, we found no effect of language on gesture when it was produced on its own: silent gestures produced by English speakers were identical to silent gestures produced by Turkish speakers in how motion elements were packaged and ordered. The findings provide evidence for a natural semantic organization that humans impose on motion events when they convey those events without language.


Cognition | 2009

Does language about similarity play a role in fostering similarity comparison in children?

Şeyda Özçalışkan; Susan Goldin-Meadow; Dedre Gentner; Carolyn Mylander

Commenting on perceptual similarities between objects stands out as an important linguistic achievement, one that may pave the way towards noticing and commenting on more abstract relational commonalities between objects. To explore whether having a conventional linguistic system is necessary for children to comment on different types of similarity comparisons, we observed four children who had not been exposed to usable linguistic input: deaf children whose hearing losses prevented them from learning spoken language and whose hearing parents had not exposed them to sign language. These children developed gesture systems that have language-like structure at many different levels. Here we ask whether the deaf children used their gestures to comment on similarity relations and, if so, which types of relations they expressed. We found that all four deaf children were able to use their gestures to express similarity comparisons (point at cat + point at tiger) resembling those conveyed by 40 hearing children in early gesture + speech combinations (“cat” + point at tiger). However, the two groups diverged at later ages. Hearing children, after acquiring the word “like,” shifted from primarily expressing global similarity (as in “cat is like tiger”) to primarily expressing single-property similarity (as in “crayon is brown like my hair”). In contrast, the deaf children, lacking an explicit term for similarity, continued to primarily express global similarity. The findings underscore the robustness of similarity comparisons in human communication, but also highlight the importance of conventional terms for comparison as likely contributors to routinely expressing more focused similarity relations.


Psychological Science | 2016

Is Seeing Gesture Necessary to Gesture Like a Native Speaker?

Şeyda Özçalışkan; Ché Lucero; Susan Goldin-Meadow

Speakers of all languages gesture, but there are differences in the gestures that they produce. Do speakers learn language-specific gestures by watching others gesture or by learning to speak a particular language? We examined this question by studying the speech and gestures produced by 40 congenitally blind adult native speakers of English and Turkish (n = 20 per language), and comparing them with the speech and gestures of 40 sighted adult speakers in each language (20 wearing blindfolds, 20 not wearing blindfolds). We focused on speakers’ descriptions of physical motion, which display strong cross-linguistic differences in patterns of speech and gesture use. Congenitally blind speakers of English and Turkish produced speech that resembled the speech produced by sighted speakers of their native language. More important, blind speakers of each language used gestures that resembled the gestures of sighted speakers of that language. Our results suggest that hearing a particular language is sufficient to gesture like a native speaker of that language.


Seminars in Speech and Language | 2013

How gesture input provides a helping hand to language development

Şeyda Özçalışkan; Nevena Dimitrova

Children use gesture to refer to objects before they produce labels for these objects, and they use gesture-speech combinations to convey semantic relations between objects before conveying sentences in speech, a trajectory that remains largely intact across children with different developmental profiles. Can the developmental changes that we observe in children be traced back to the gestural input that children receive from their parents? A review of previous work shows that parents provide models for their children for the types of gestures and gesture-speech combinations to produce, and do so by modifying their gestures to meet the communicative needs of their children. More importantly, the gestures that parents produce, in addition to providing models, help children learn labels for referents and semantic relations between these referents, and even predict the extent of children’s vocabularies several years later. The existing research thus highlights the important role parental gestures play in shaping children’s language-learning trajectory.


Journal of Cognition and Development | 2017

Early Gesture Provides a Helping Hand to Spoken Vocabulary Development for Children with Autism, Down Syndrome, and Typical Development

Şeyda Özçalışkan; Lauren B. Adamson; Nevena Dimitrova; Stephanie Baumann

Typically developing (TD) children refer to objects uniquely in gesture (e.g., point at a cat) before they produce verbal labels for these objects (“cat”). The onset of such gestures predicts the onset of similar spoken words, showing a strong positive relation between early gestures and early words. We asked whether gesture plays the same door-opening role in word learning for children with autism spectrum disorder (ASD) and Down syndrome (DS), who show delayed vocabulary development and who differ in the strength of gesture production. To answer this question, we observed 23 18-month-old TD children, 23 30-month-old children with ASD, and 23 30-month-old children with DS 5 times over a year during parent–child interactions. Children in all 3 groups initially expressed a greater proportion of referents uniquely in gesture than in speech. Many of these unique gestures subsequently entered children’s spoken vocabularies within a year—a pattern that was slightly less robust for children with DS, whose word production was the most markedly delayed. These results indicate that gesture is as fundamental to vocabulary development for children with developmental disorders as it is for TD children.


Journal of Autism and Developmental Disorders | 2017

Do Verbal Children with Autism Comprehend Gesture as Readily as Typically Developing Children?

Nevena Dimitrova; Şeyda Özçalışkan; Lauren B. Adamson

Gesture comprehension remains understudied, particularly in children with autism spectrum disorder (ASD) who have difficulties in gesture production. Using a novel gesture comprehension task, Study 1 examined how 2- to 4-year-old typically-developing (TD) children comprehend types of gestures and gesture–speech combinations, and showed better comprehension of deictic gestures and reinforcing gesture–speech combinations than iconic/conventional gestures and supplementary gesture–speech combinations at each age. Study 2 compared verbal children with ASD to TD children, comparable in receptive language ability, and showed similar patterns of comprehension in each group. Our results suggest that children comprehend deictic gestures and reinforcing gesture–speech combinations better than iconic/conventional gestures and supplementary combinations—a pattern that remains robust across different ages within TD children and children with ASD.


Metaphor and Symbol | 2013

Teasing Apart the Role of Cognitive and Verbal Factors in Children's Early Metaphorical Abilities

Lauren J. Stites; Şeyda Özçalışkan

Metaphor plays a unique role in cognitive development by structuring abstract concepts and leading to conceptual change. Existing work suggests early emergence of metaphorical abilities, with five-year-olds understanding and explaining metaphors that involve cross-domain comparisons (e.g., SPACE to TIME). Yet relatively little is known about the factors that explain this developmental change. This study focuses on spatial metaphors for time, and asks whether cognitive and/or verbal factors best explain developmental changes in three- to six-year-old children’s comprehension and explanation of metaphors. The results show that children’s grasp of the time concept—but not verbal ability—predicts their metaphor comprehension. Verbal ability, on the other hand, is a predictor of metaphor explanation, even after controlling for age. The results thus suggest that cognitive and verbal factors selectively predict children’s emerging metaphorical abilities.


Journal of Experimental Child Psychology | 2018

Type of iconicity influences children’s comprehension of gesture

Leslie Hodges; Şeyda Özçalışkan; Rebecca A. Williamson

Children produce iconic gestures conveying action information earlier than ones conveying attribute information (Özçalışkan, Gentner, & Goldin-Meadow, 2014). In this study, we ask whether children’s comprehension of iconic gestures follows a similar pattern, with earlier comprehension of iconic gestures conveying action. Children, ages 2 to 4 years, were presented with 12 minimally informative speech + iconic gesture combinations, conveying either an action (e.g., open palm flapping as if a bird flying) or an attribute (e.g., fingers spread as if a bird’s wings) associated with a referent. They were asked to choose the correct match for each gesture in a forced-choice task. Our results showed that children could identify the referent of an iconic gesture conveying a characteristic action earlier (age 2) than the referent of an iconic gesture conveying a characteristic attribute (age 3). Overall, our study identifies ages 2 to 3 as important in the development of comprehension of iconic co-speech gestures, and indicates that the comprehension of iconic gestures with action meanings is easier than, and may even precede, the comprehension of iconic gestures with attribute meanings.

Collaboration


Dive into Şeyda Özçalışkan's collaborations.

Top Co-Authors

Leslie Hodges

Georgia State University
