Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Spencer D. Kelly is active.

Publication


Featured research published by Spencer D. Kelly.


Psychological Science | 2010

Two Sides of the Same Coin: Speech and Gesture Mutually Interact to Enhance Comprehension

Spencer D. Kelly; Eric Maris

Gesture and speech are assumed to form an integrated system during language production. Based on this view, we propose the integrated-systems hypothesis, which explains two ways in which gesture and speech are integrated—through mutual and obligatory interactions—in language comprehension. Experiment 1 presented participants with action primes (e.g., someone chopping vegetables) and bimodal speech and gesture targets. Participants related primes to targets more quickly and accurately when they contained congruent information (speech: “chop”; gesture: chop) than when they contained incongruent information (speech: “chop”; gesture: twist). Moreover, the strength of the incongruence affected processing, with fewer errors for weak incongruities (speech: “chop”; gesture: cut) than for strong incongruities (speech: “chop”; gesture: twist). Crucial for the integrated-systems hypothesis, this influence was bidirectional. Experiment 2 demonstrated that gesture’s influence on speech was obligatory. The results confirm the integrated-systems hypothesis and demonstrate that gesture and speech form an integrated system in language comprehension.


Brain and Language | 2004

Neural correlates of bimodal speech and gesture comprehension

Spencer D. Kelly; Corinne Kravitz; Michael Hopkins

The present study examined the neural correlates of speech and hand gesture comprehension in a naturalistic context. Fifteen participants watched audiovisual segments of speech and gesture while event-related potentials (ERPs) were recorded to the speech. Gesture influenced the ERPs to the speech. Specifically, there was a right-lateralized N400 effect, reflecting semantic integration, when gestures mismatched versus matched the speech. In addition, early sensory components in bilateral occipital and frontal sites differentiated speech accompanied by matching versus non-matching gestures. These results suggest that hand gestures may be integrated with speech at early and late stages of language processing.


Language and Cognitive Processes | 2009

Brief Training with Co-Speech Gesture Lends a Hand to Word Learning in a Foreign Language

Spencer D. Kelly; Tara McDevitt; Megan Esch

Recent research in psychology and neuroscience has demonstrated that co-speech gestures are semantically integrated with speech during language comprehension and development. The present study explored whether gestures also play a role in language learning in adults. In Experiment 1, we exposed adults to a brief training session presenting novel Japanese verbs with and without hand gestures. Three sets of memory tests (at five minutes, two days, and one week) showed that the greatest word learning occurred when gestures conveyed redundant imagistic information to speech. Experiment 2 was a preliminary investigation into possible neural correlates for such learning. We exposed participants to similar training sessions over three days and then measured event-related potentials (ERPs) to words learned with and without co-speech gestures. The main finding was that words learned with gesture produced a larger Late Positive Complex (indexing recollection) in bilateral parietal sites than words learned without gesture. However, there was no significant difference between the two conditions for the N400 component (indexing familiarity). The results have implications for pedagogical practices in foreign language instruction and theories of gesture-speech integration.


Language and Linguistics Compass | 2008

Gesture Gives a Hand to Language and Learning: Perspectives from Cognitive Neuroscience, Developmental Psychology and Education

Spencer D. Kelly; Sarah M. Manning; Sabrina Rodak

People of all ages, cultures and backgrounds gesture when they speak. These hand movements are so natural and pervasive that researchers across many fields – from linguistics to psychology to neuroscience – have claimed that the two modalities form an integrated system of meaning during language production and comprehension. This special relationship has implications for a variety of research and applied domains. Gestures may provide unique insights into language and cognitive development, and also help clinicians identify, understand and even treat developmental disorders in childhood. In addition, research in education suggests that teachers can use gesture to become even more effective in several fundamental aspects of their profession, including communication, assessment of student knowledge, and the ability to instill a profound understanding of abstract concepts in traditionally difficult domains such as language and mathematics. This work converging from multiple perspectives will push researchers and practitioners alike to view hand gestures in a new and constructive way.


Journal of Child Language | 2001

Broadening the units of analysis in communication: speech and nonverbal behaviours in pragmatic comprehension

Spencer D. Kelly

Recently, much research has explored the role that nonverbal pointing behaviours play in children's early acquisition of language, for example during word learning. However, few researchers have considered the possibility that these behaviours may continue to play a role in language comprehension as children develop more sophisticated language skills. The present study investigates the role that eye gaze and pointing gestures play in three- to five-year-olds' understanding of complex pragmatic communication. Experiment 1 demonstrates that children (N = 29) better understand videotapes of a mother making indirect requests to a child when the requests are accompanied by nonverbal pointing behaviours. Experiment 2 uses a different methodology in which children (N = 27) are actual participants rather than observers in order to generalize the findings to naturalistic, face-to-face interactions. The results from both experiments suggest that broader units of analysis beyond the verbal message may be needed in studying children's continuing understanding of pragmatic processes.


Journal of Cognitive Neuroscience | 2010

Integrating speech and iconic gestures in a Stroop-like task: Evidence for automatic processing

Spencer D. Kelly; Peter Creigh; James Bartolotti

Previous research has demonstrated a link between language and action in the brain. The present study investigates the strength of this neural relationship by focusing on a potential interface between the two systems: cospeech iconic gesture. Participants performed a Stroop-like task in which they watched videos of a man and a woman speaking and gesturing about common actions. The videos differed as to whether the gender of the speaker and gesturer was the same or different and whether the content of the speech and gesture was congruent or incongruent. The task was to identify whether a man or a woman produced the spoken portion of the videos while accuracy rates, RTs, and ERPs were recorded to the words. Although not relevant to the task, participants paid attention to the semantic relationship between the speech and the gesture, producing a larger N400 to words accompanied by incongruent versus congruent gestures. In addition, RTs were slower to incongruent versus congruent gesture–speech stimuli, but this effect was greater when the gender of the gesturer and speaker was the same versus different. These results suggest that the integration of gesture and speech during language comprehension is automatic but also under some degree of neurocognitive control.


Brain and Language | 2007

An intentional stance modulates the integration of gesture and speech during comprehension

Spencer D. Kelly; Sarah Ward; Peter Creigh; James Bartolotti

The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when integrating the two modalities at comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous and incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator), and in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bilateral frontal and central N400 effect to words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture (that is, when gesture and speech were not intentionally meant to go together), the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.


Developmental Neuropsychology | 2002

Putting language back in the body: Speech and gesture on three time frames

Spencer D. Kelly; Jana M. Iverson; Joseph Terranova; Julia Niego; Michael Hopkins; Leslie H. Goldsmith

This article investigates the role that nonverbal actions play in language processing over 3 different time frames. First, we speculate that nonverbal actions played a role in how formal language systems emerged from our primate ancestors over evolutionary time. Next, we hypothesize that if nonverbal behaviors played a foundational role in the emergence of language over evolution, these actions should influence how children learn language in the present. Finally, we argue that nonverbal actions continue to play a role for adults in the moment-to-moment processing of language. Throughout, we take an embodied view of language and argue that the neural, cognitive, and social components of language processing are firmly grounded in bodily action.


Cognition and Instruction | 2002

A Helping Hand in Assessing Children's Knowledge: Instructing Adults to Attend to Gesture

Spencer D. Kelly; Melissa Singer; Janna Hicks; Susan Goldin-Meadow

The spontaneous hand gestures that accompany children's explanations of concepts have been used by trained experimenters to gain insight into children's knowledge. In this article, 3 experiments tested whether it is possible to teach adults who are not trained investigators to comprehend information conveyed through children's hand gestures. In Experiment 1, we used a questionnaire to explore whether adults benefit from gesture instruction when making assessments of young children's knowledge of conservation problems. In Experiment 2, we used a similar questionnaire, but asked adults to make assessments of older children's mathematical knowledge. Experiment 3 also concentrated on math assessments, but used a free-recall paradigm to test the extent of the adults' understanding of the child's knowledge. Taken together, the results of the experiments suggest that instructing adults to attend to gesture enhances their assessment of children's knowledge at multiple ages and across multiple domains.


Learning Disability Quarterly | 2001

The Use of Brain Electrophysiology Techniques to Study Language: A Basic Guide for the Beginning Consumer of Electrophysiology Information

Dennis L. Molfese; Victoria J. Molfese; Spencer D. Kelly

This article provides a basic background for the professional who is interested in utilizing event-related potential (ERP) approaches to study language processes but has little background in or knowledge about the technique. First, a brief history of the emergence of this technology is presented, followed by definitions, a theoretical overview, and a practical guide to conducting ERP studies. The basis for choice of electrode positions, equipment characteristics (e.g., filter settings), and analyses are also discussed. Finally, examples of language studies that utilize this information in a research study are provided.

Collaboration


Dive into Spencer D. Kelly's collaboration network.

Top Co-Authors

R. Breckinridge Church
Northeastern Illinois University

Dennis L. Molfese
University of Nebraska–Lincoln

Meghan L. Healey
University of Pennsylvania